diff --git a/.buildkite/release.yml b/.buildkite/release.yml index a40a71917..7a9c297cb 100644 --- a/.buildkite/release.yml +++ b/.buildkite/release.yml @@ -1,7 +1,7 @@ steps: - label: Release agents: - image: "golang:1.24.5@sha256:14fd8a55e59a560704e5fc44970b301d00d344e45d6b914dda228e09f359a088" + image: "golang:1.25.1@sha256:8305f5fa8ea63c7b5bc85bd223ccc62941f852318ebfbd22f53bbd0b358c07e1" cpu: "16" memory: "24G" ephemeralStorage: "20G" diff --git a/.github/ISSUE_TEMPLATE/bug_report.md b/.github/ISSUE_TEMPLATE/bug_report.md index c3f5cea54..30682190a 100644 --- a/.github/ISSUE_TEMPLATE/bug_report.md +++ b/.github/ISSUE_TEMPLATE/bug_report.md @@ -36,4 +36,4 @@ If applicable, add screenshots to help explain your problem. - Elasticsearch Version [e.g. 7.16.0] **Additional context** -Add any other context about the problem here. +Add any other context about the problem here. Links to specific affected code files and paths here are also extremely useful (if known). diff --git a/.github/copilot-instructions.md b/.github/copilot-instructions.md new file mode 100644 index 000000000..234be2e41 --- /dev/null +++ b/.github/copilot-instructions.md @@ -0,0 +1,132 @@ +You will be tasked to fix an issue from an open-source repository. This is a Go-based repository hosting a Terraform provider for the Elastic Stack (Elasticsearch and Kibana) APIs. This repo currently supports both [plugin framework](https://developer.hashicorp.com/terraform/plugin/framework/getting-started/code-walkthrough) and [sdkv2](https://developer.hashicorp.com/terraform/plugin/sdkv2) resources. Unless you're told otherwise, all new resources _must_ use the plugin framework. + +Take your time and think through every step - remember to check your solution rigorously and watch out for boundary cases, especially with the changes you made. Your solution must be perfect. If not, continue working on it. 
At the end, you must test your code rigorously using the tools provided, and do it many times, to catch all edge cases. If it is not robust, iterate more and make it perfect. Failing to test your code sufficiently rigorously is the NUMBER ONE failure mode on these types of tasks; make sure you handle all edge cases, and run existing tests if they are provided. + +Please see [README.md](../README.md) and the [CONTRIBUTING.md](../CONTRIBUTING.md) docs before getting started. + +# Workflow + +## High-Level Problem Solving Strategy + +1. Understand the problem deeply. Carefully read the issue and think critically about what is required. +2. Investigate the codebase. Explore relevant files, search for key functions, and gather context. +3. Develop a clear, step-by-step plan. Break down the fix into manageable, incremental steps. +4. Implement the fix incrementally. Make small, testable code changes. +5. Debug as needed. Use debugging techniques to isolate and resolve issues. +6. Test frequently. Run tests after each change to verify correctness. +7. Iterate until the root cause is fixed and all tests pass. +8. Reflect and validate comprehensively. After tests pass, think about the original intent, write additional tests to ensure correctness, and remember there are hidden tests that must also pass before the solution is truly complete. + +Refer to the detailed sections below for more information on each step. + +## 1. Deeply Understand the Problem +Carefully read the issue and think hard about a plan to solve it before coding. Your thinking should be thorough, so it's fine if it's very long. You can think step by step before and after each action you decide to take. + +## 2. Codebase Investigation +- Explore relevant files and directories. +- Search for key functions, classes, or variables related to the issue. +- Read and understand relevant code snippets. +- Identify the root cause of the problem. 
+- Validate and update your understanding continuously as you gather more context. + +## 3. Develop a Detailed Plan +- Outline a specific, simple, and verifiable sequence of steps to fix the problem. +- Break down the fix into small, incremental changes. + +## 4. Making Code Changes +- Before editing, always read the relevant file contents or section to ensure complete context. +- If a patch is not applied correctly, attempt to reapply it. +- Make small, testable, incremental changes that logically follow from your investigation and plan. + +## 5. Debugging +- Make code changes only if you have high confidence they can solve the problem. +- When debugging, try to determine the root cause rather than addressing symptoms. +- Debug for as long as needed to identify the root cause and determine a fix. +- Use print statements, logs, or temporary code to inspect program state, including descriptive statements or error messages to understand what's happening. +- To test hypotheses, you can also add test statements or functions. +- Revisit your assumptions if unexpected behavior occurs. +- You MUST iterate and keep going until the problem is solved. + +## 6. Testing +- Run tests frequently using `make test` and `make testacc`. +- After each change, verify correctness by running relevant tests. +- If tests fail, analyze failures and revise your patch. +- Write additional tests if needed to capture important behaviors or edge cases. +- NEVER accept acceptance tests that have been skipped due to environment issues; always ensure the environment is correctly set up and all tests run successfully. 
+ +### 6.1 Acceptance Testing Requirements +When running acceptance tests, ensure the following: + +- **Environment Variables** - The following environment variables are required for acceptance tests: + - `ELASTICSEARCH_ENDPOINTS` (default: http://localhost:9200) + - `ELASTICSEARCH_USERNAME` (default: elastic) + - `ELASTICSEARCH_PASSWORD` (default: password) + - `KIBANA_ENDPOINT` (default: http://localhost:5601) + - `TF_ACC` (must be set to "1" to enable acceptance tests) +- **Run targeted tests using `go test`** - Ensure the required environment variables are explicitly defined when running targeted tests. Example: + ```bash + ELASTICSEARCH_ENDPOINTS=http://localhost:9200 ELASTICSEARCH_USERNAME=elastic ELASTICSEARCH_PASSWORD=password KIBANA_ENDPOINT=http://localhost:5601 TF_ACC=1 go test -v -run TestAccResourceName ./path/to/testfile.go + ``` + +## 7. Final Verification +- Confirm the root cause is fixed. +- Review your solution for logic correctness and robustness. +- Iterate until you are extremely confident the fix is complete and all tests pass. +- Run the acceptance tests for any changed resources. Ensure acceptance tests pass without any environment-related skips. Use `make testacc` to verify this, explicitly defining the required environment variables. +- Run `make lint` to ensure any linting errors have not surfaced with your changes. This task may automatically correct any linting errors, and regenerate documentation. Include any changes in your commit. + +## 8. Final Reflection and Additional Testing +- Reflect carefully on the original intent of the user and the problem statement. +- Think about potential edge cases or scenarios that may not be covered by existing tests. +- Write additional tests that would need to pass to fully validate the correctness of your solution. +- Run these new tests and ensure they all pass. +- Be aware that there are additional hidden tests that must also pass for the solution to be successful. 
+- Do not assume the task is complete just because the visible tests pass; continue refining until you are confident the fix is robust and comprehensive. + +## 9. Before Submitting Pull Requests +- Run `make docs-generate` to update the documentation, and ensure the results of this command make it into your pull request. + +## Repository Structure + +• **docs/** - Documentation files + • **data-sources/** - Documentation for Terraform data sources + • **guides/** - User guides and tutorials + • **resources/** - Documentation for Terraform resources +• **examples/** - Example Terraform configurations + • **cloud/** - Examples using the cloud to launch testing stacks + • **data-sources/** - Data source usage examples + • **resources/** - Resource usage examples + • **provider/** - Provider configuration examples +• **generated/** - Auto-generated clients from the `generate-clients` make target + • **alerting/** - Kibana alerting API client + • **connectors/** - Kibana connectors API client + • **kbapi/** - Kibana API client + • **slo/** - SLO (Service Level Objective) API client +• **internal/** - Internal Go packages + • **acctest/** - Acceptance test utilities + • **clients/** - API client implementations + • **elasticsearch/** - Elasticsearch-specific logic + • **fleet/** - Fleet management functionality + • **kibana/** - Kibana-specific logic + • **models/** - Data models and structures + • **schema/** - Connection schema definitions for plugin framework + • **utils/** - Utility functions + • **versionutils/** - Version handling utilities +• **libs/** - External libraries + • **go-kibana-rest/** - Kibana REST API client library +• **provider/** - Core Terraform provider implementation +• **scripts/** - Utility scripts for development and CI +• **templates/** - Template files for documentation generation + • **data-sources/** - Data source documentation templates + • **resources/** - Resource documentation templates + • **guides/** - Guide documentation templates 
+• **xpprovider/** - Additional provider functionality needed for Crossplane + +## Key Guidelines +* Follow Go best practices and idiomatic patterns +* Maintain existing code structure and organization +* Write unit tests for new functionality. Use table-driven unit tests when possible. +* When creating a new Plugin Framework based resource, follow the code organisation of `internal/elasticsearch/security/system_user` +* Avoid adding any extra functionality into the `utils` package, instead prefer adding to a more specific package or creating one to match the purpose +* Think through your planning first using the codebase as your guide before creating new resources and data sources + diff --git a/.github/workflows/copilot-setup-steps.yml b/.github/workflows/copilot-setup-steps.yml index d9b9ccac6..17d5e3290 100644 --- a/.github/workflows/copilot-setup-steps.yml +++ b/.github/workflows/copilot-setup-steps.yml @@ -21,66 +21,10 @@ jobs: permissions: # If you want to clone the repository as part of your setup steps, for example to install dependencies, you'll need the `contents: read` permission. If you don't clone the repository in your setup steps, Copilot will do this for you automatically after the steps complete. 
contents: read - env: - ELASTICSEARCH_ENDPOINTS: "http://localhost:9200" - ELASTICSEARCH_USERNAME: "elastic" - ELASTICSEARCH_PASSWORD: password - KIBANA_ENDPOINT: "http://localhost:5601" - KIBANA_USERNAME: "elastic" - KIBANA_PASSWORD: password - KIBANA_SYSTEM_USERNAME: kibana_system - KIBANA_SYSTEM_PASSWORD: password - TF_ACC: "1" - services: - elasticsearch: - image: docker.elastic.co/elasticsearch/elasticsearch:9.0.3 - env: - discovery.type: single-node - xpack.security.enabled: true - xpack.security.authc.api_key.enabled: true - xpack.security.authc.token.enabled: true - xpack.watcher.enabled: true - xpack.license.self_generated.type: trial - repositories.url.allowed_urls: https://example.com/* - path.repo: /tmp - ELASTIC_PASSWORD: ${{ env.ELASTICSEARCH_PASSWORD }} - ports: - - 9200:9200 - options: --health-cmd="curl http://localhost:9200/_cluster/health" --health-interval=10s --health-timeout=5s --health-retries=10 - kibana: - image: docker.elastic.co/kibana/kibana:9.0.3 - env: - SERVER_NAME: kibana - ELASTICSEARCH_HOSTS: http://elasticsearch:9200 - ELASTICSEARCH_USERNAME: ${{ env.KIBANA_SYSTEM_USERNAME }} - ELASTICSEARCH_PASSWORD: ${{ env.KIBANA_SYSTEM_PASSWORD }} - XPACK_ENCRYPTEDSAVEDOBJECTS_ENCRYPTIONKEY: a7a6311933d3503b89bc2dbc36572c33a6c10925682e591bffcab6911c06786d - # LOGGING_ROOT_LEVEL: debug - ports: - - 5601:5601 - options: --health-cmd="curl http://localhost:5601/api/status" --health-interval=10s --health-timeout=5s --health-retries=10 - fleet: - image: docker.elastic.co/elastic-agent/elastic-agent:9.0.3 - env: - SERVER_NAME: fleet - FLEET_ENROLL: "1" - FLEET_URL: https://fleet:8220 - FLEET_INSECURE: "true" - FLEET_SERVER_ENABLE: "1" - FLEET_SERVER_POLICY_ID: fleet-server - FLEET_SERVER_ELASTICSEARCH_HOST: http://elasticsearch:9200 - FLEET_SERVER_ELASTICSEARCH_INSECURE: "true" - FLEET_SERVER_INSECURE_HTTP: "true" - KIBANA_HOST: http://kibana:5601 - KIBANA_FLEET_SETUP: "1" - KIBANA_FLEET_PASSWORD: ${{ env.ELASTICSEARCH_PASSWORD }} - ports: - - 
8220:8220 - options: --restart="unless-stopped" steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4 - - uses: actions/setup-go@d35c59abb061a4a6fb18e82ac0862c26744d6ab5 # v5 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5 + - uses: actions/setup-go@44694675825211faa026b3c33043df3e48a5fa00 # v6 with: go-version-file: 'go.mod' cache: true @@ -88,6 +32,9 @@ jobs: with: terraform_wrapper: false + - name: Setup Elastic Stack + run: make docker-fleet + - name: Get dependencies run: make setup diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml index 4f7808896..5e23596ad 100644 --- a/.github/workflows/test.yml +++ b/.github/workflows/test.yml @@ -18,8 +18,8 @@ jobs: runs-on: ubuntu-latest timeout-minutes: 5 steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4 - - uses: actions/setup-go@d35c59abb061a4a6fb18e82ac0862c26744d6ab5 # v5 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5 + - uses: actions/setup-go@44694675825211faa026b3c33043df3e48a5fa00 # v6 with: go-version-file: 'go.mod' cache: true @@ -34,8 +34,8 @@ jobs: name: Lint runs-on: ubuntu-latest steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4 - - uses: actions/setup-go@d35c59abb061a4a6fb18e82ac0862c26744d6ab5 # v5 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5 + - uses: actions/setup-go@44694675825211faa026b3c33043df3e48a5fa00 # v6 with: go-version-file: 'go.mod' cache: true @@ -45,7 +45,7 @@ jobs: terraform_wrapper: false - name: Lint - run: make lint + run: make check-lint test: name: Matrix Acceptance Test @@ -126,10 +126,11 @@ jobs: - '8.15.5' - '8.16.2' - '8.17.0' + - '8.18.3' - '9.0.3' steps: - - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4 - - uses: actions/setup-go@d35c59abb061a4a6fb18e82ac0862c26744d6ab5 # v5 + - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5 + - uses: 
actions/setup-go@44694675825211faa026b3c33043df3e48a5fa00 # v6 with: go-version-file: 'go.mod' cache: true diff --git a/.gitignore b/.gitignore index 3c429c9dd..bcfa6a1c2 100644 --- a/.gitignore +++ b/.gitignore @@ -26,6 +26,7 @@ website/node_modules *.test *.iml *.vscode +__debug_* website/vendor diff --git a/CHANGELOG.md b/CHANGELOG.md index 2bcd4541b..32b53502b 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,5 +1,27 @@ ## [Unreleased] +- Create `elasticstack_kibana_maintenance_window` resource. ([#1224](https://github.com/elastic/terraform-provider-elasticstack/pull/1224)) +- Add support for `solution` field in `elasticstack_kibana_space` resource and data source ([#1102](https://github.com/elastic/terraform-provider-elasticstack/issues/1102)) +- Add `slo_id` validation to `elasticstack_kibana_slo` ([#1221](https://github.com/elastic/terraform-provider-elasticstack/pull/1221)) +- Add `ignore_missing_component_templates` to `elasticstack_elasticsearch_index_template` ([#1206](https://github.com/elastic/terraform-provider-elasticstack/pull/1206)) +- Migrate `elasticstack_elasticsearch_enrich_policy` resource and data source to Terraform Plugin Framework ([#1220](https://github.com/elastic/terraform-provider-elasticstack/pull/1220)) +- Prevent provider panic when a script exists in state, but not in Elasticsearch ([#1218](https://github.com/elastic/terraform-provider-elasticstack/pull/1218)) +- Add support for managing cross_cluster API keys in `elasticstack_elasticsearch_security_api_key` ([#1252](https://github.com/elastic/terraform-provider-elasticstack/pull/1252)) +- Allow version changes without a destroy/create cycle with `elasticstack_fleet_integration` ([#1255](https://github.com/elastic/terraform-provider-elasticstack/pull/1255)). This fixes an issue where it was impossible to upgrade integrations which are used by an integration policy. 
+- Add `namespace` attribute to `elasticstack_kibana_synthetics_monitor` resource to support setting data stream namespace independently from `space_id` ([#1247](https://github.com/elastic/terraform-provider-elasticstack/pull/1247)) +- Support setting an explicit `connector_id` in `elasticstack_kibana_action_connector`. This attribute already existed, but was being ignored by the provider. Setting the attribute will return an error in Elastic Stack v8.8 and lower since creating a connector with an explicit ID is not supported. ([#1260](https://github.com/elastic/terraform-provider-elasticstack/pull/1260)) +- Migrate `elasticstack_kibana_action_connector` to the Terraform plugin framework ([#1269](https://github.com/elastic/terraform-provider-elasticstack/pull/1269)) +- Migrate `elasticstack_elasticsearch_security_role_mapping` resource and data source to Terraform Plugin Framework ([#1279](https://github.com/elastic/terraform-provider-elasticstack/pull/1279)) +- Add support for `inactivity_timeout` in `elasticstack_fleet_agent_policy` ([#641](https://github.com/elastic/terraform-provider-elasticstack/issues/641)) +- Migrate `elasticstack_elasticsearch_script` resource to Terraform Plugin Framework ([#1297](https://github.com/elastic/terraform-provider-elasticstack/pull/1297)) +- Add support for `kafka` output types in `elasticstack_fleet_output` ([#1302](https://github.com/elastic/terraform-provider-elasticstack/pull/1302)) +- Add support for `prevent_initial_backfill` to `elasticstack_kibana_slo` ([#1071](https://github.com/elastic/terraform-provider-elasticstack/pull/1071)) +- [Refactor] Regenerate the SLO client using the current OpenAPI spec ([#1303](https://github.com/elastic/terraform-provider-elasticstack/pull/1303)) +- Add support for `data_view_id` in the `elasticstack_kibana_slo` resource ([#1305](https://github.com/elastic/terraform-provider-elasticstack/pull/1305)) +- Add support for `unenrollment_timeout` in `elasticstack_fleet_agent_policy` 
([#1169](https://github.com/elastic/terraform-provider-elasticstack/issues/1169)) +- Handle default value for `allow_restricted_indices` in `elasticstack_elasticsearch_security_api_key` ([#1315](https://github.com/elastic/terraform-provider-elasticstack/pull/1315)) +- Fixed `nil` reference in kibana synthetics API client in case of response errors ([#1320](https://github.com/elastic/terraform-provider-elasticstack/pull/1320)) + ## [0.11.17] - 2025-07-21 - Add `elasticstack_apm_agent_configuration` resource ([#1196](https://github.com/elastic/terraform-provider-elasticstack/pull/1196)) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 24b7a27b0..a66169b62 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -1,60 +1,136 @@ -# Typical development workflow +# Contributing -Fork the repo, work on an issue +This guide explains how to set up your environment, make changes, and submit a PR. -## Updating the generated Kibana client. +## Development Setup -If your work involves the Kibana API, the endpoints may or may not be included in the generated client. -Check [generated/kbapi](./generated/kbapi/) for more details. +* Fork and clone the repo. +* Set up your preferred IDE (IntelliJ, VSCode, etc.) -## Acceptance tests +Requirements: +* [Terraform](https://www.terraform.io/downloads.html) >= 1.0.0 +* [Go](https://golang.org/doc/install) >= 1.25 +* Docker (for acceptance tests) -```bash -make docker-testacc -``` +## Development Workflow -Run a single test with terraform debug enabled: -```bash -env TF_LOG=DEBUG make docker-testacc TESTARGS='-run ^TestAccResourceDataStreamLifecycle$$' -``` +* Create a new branch for your changes. +* Make your changes. See [Useful Commands](#useful-commands) and [Debugging](#running--debugging-the-provider). +* Validate your changes: + * Run unit and acceptance tests (See [Running Acceptance Tests](#running-acceptance-tests)). + * Run `make lint` to check linting and formatting. 
For this check to succeed, all changes must have been committed. + * All checks also run automatically on every PR. +* Submit your PR for review. +* Add a changelog entry in `CHANGELOG.md` under the `Unreleased` section. This will be included in the release notes of the next release. The changelog entry references the PR, so it has to be added after the PR has been opened. -A way to forward debug logs to a file: -```bash -env TF_ACC_LOG_PATH=/tmp/tf.log TF_ACC_LOG=DEBUG TF_LOG=DEBUG make docker-testacc -``` +When creating new resources: +* Use the [Plugin Framework](https://developer.hashicorp.com/terraform/plugin/framework/getting-started/code-walkthrough) for new resources. + * Use an existing resource (e.g. `internal/elasticsearch/security/system_user`) as a template. + * Some resources use the deprecated Terraform SDK, so only resources using the new Terraform Framework should be used as reference. +* Use the generated API clients to interact with the Kibana APIs. (See [Working with Generated API Clients](#working-with-generated-api-clients)) +* Add a documentation template and examples for the resource. See [Updating Documentation](#updating-documentation) for more details. +* Write unit and acceptance tests. +### Useful Commands -## Update documentation +* `make build`: Build the provider. +* `make lint`: Lints and formats the code. +* `make test`: Run unit tests. +* `make docs-generate`: Generate documentation. +* [Running & Debugging the Provider](#running--debugging-the-provider) +* [Running Acceptance Tests](#running-acceptance-tests) -Update documentation templates in `./templates` directory and re-generate docs via: -```bash -make docs-generate -``` +### Running & Debugging the Provider -## Update `./CHANGELOG.md` +You can run the currently checked-out code for local testing and use it with Terraform. -List of previous commits is a good example of what should be included in the changelog. 
+Also see [Terraform docs on debugging](https://developer.hashicorp.com/terraform/plugin/debugging#starting-a-provider-in-debug-mode). +Run the provider in debug mode and reattach the provider in Terraform: +* Launch `main.go` with the `-debug` flag from your IDE. + * Or launch it with `go run main.go -debug` from the command line. +* After launching, the provider will print an env var. Copy the printed `TF_REATTACH_PROVIDERS='{…}'` value. +* Export it in your shell where you run Terraform: `export TF_REATTACH_PROVIDERS='{…}'`. +* Terraform will now talk to your debug instance, and you can set breakpoints. -## Pull request +### Running Acceptance Tests -Format the code before pushing: -```bash -make fmt -``` +Acceptance tests spin up Elasticsearch, Kibana, and Fleet with Docker and run tests in a Go container. -Check if the linting: ```bash -make lint -``` +# Start Elasticsearch, Kibana, and Fleet +make docker-fleet -Create a PR and check acceptance test matrix is green. +# Run all tests +make testacc -## Run provider with local terraform +# Run a specific test +make testacc TESTARGS='-run ^TestAccResourceDataStreamLifecycle$$' -TBD +# Cleanup created docker containers +make docker-clean +``` -## Releasing +### Working with Generated API Clients + +If your work involves the Kibana API, the API client can be generated directly from the Kibana OpenAPI specs: +- For Kibana APIs, use the generated client in `generated/kbapi`. +- To add new endpoints, see [generated/kbapi/README.md](generated/kbapi/README.md). +- Regenerate clients with: + ```sh + make transform generate + ``` + +The codebase includes a number of deprecated clients which should not be used anymore: +- `libs/go-kibana-rest`: Fork of an external library, which is not maintained anymore. +- `generated/alerting`, `generated/connectors`, `generated/slo`: Older generated clients, but based on non-standard specs. If any of these APIs are needed, they should be included in the `kbapi` client. 
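+The workflow above asks contributors to write unit and acceptance tests, preferring the table-driven style. As a minimal, self-contained sketch of that pattern (the `normalizeSpaceID` helper here is hypothetical and exists only for illustration, it is not part of the provider):

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeSpaceID is a hypothetical helper used only to illustrate
// the table-driven test style; it is not part of the provider.
func normalizeSpaceID(id string) string {
	if id == "" {
		return "default"
	}
	return strings.ToLower(id)
}

func main() {
	// Each case carries a name, an input, and the expected output.
	// In a real test file this loop would live in a Test* function
	// and call t.Run(tc.name, ...) with t.Errorf on mismatch.
	cases := []struct {
		name string
		in   string
		want string
	}{
		{"empty input falls back to default", "", "default"},
		{"mixed case is lowered", "MySpace", "myspace"},
		{"already normalized", "dev", "dev"},
	}
	for _, tc := range cases {
		got := normalizeSpaceID(tc.in)
		fmt.Printf("%s: got=%q want=%q ok=%v\n", tc.name, got, tc.want, got == tc.want)
	}
}
```

+In the repository itself, the same table would sit in a `_test.go` file next to the code under test and run via `make test`.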
+ +### Updating Documentation + +Docs are generated from templates in `templates/` and examples in `examples/`. +* Update or add templates and examples. +* Run `make docs-generate` to produce files under `docs/`. +* Commit the generated files. `make lint` will fail if docs are stale. + +## Project Structure + +A quick overview of what's in each folder: + +* `docs/` - Documentation files + * `data-sources/` - Documentation for Terraform data sources + * `guides/` - User guides and tutorials + * `resources/` - Documentation for Terraform resources +* `examples/` - Example Terraform configurations + * `cloud/` - Examples using the cloud to launch testing stacks + * `data-sources/` - Data source usage examples + * `resources/` - Resource usage examples + * `provider/` - Provider configuration examples +* `generated/` - Auto-generated clients from the `generate-clients` make target + * `kbapi/` - Kibana API client + * `alerting/` - (Deprecated) Kibana alerting API client + * `connectors/` - (Deprecated) Kibana connectors API client + * `slo/` - (Deprecated) SLO (Service Level Objective) API client +* `internal/` - Internal Go packages + * `acctest/` - Acceptance test utilities + * `clients/` - API client implementations + * `elasticsearch/` - Elasticsearch-specific logic + * `fleet/` - Fleet management functionality + * `kibana/` - Kibana-specific logic + * `models/` - Data models and structures + * `schema/` - Connection schema definitions for plugin framework + * `utils/` - Utility functions + * `versionutils/` - Version handling utilities +* `libs/` - External libraries + * `go-kibana-rest/` - (Deprecated) Kibana REST API client library +* `provider/` - Core Terraform provider implementation +* `scripts/` - Utility scripts for development and CI +* `templates/` - Template files for documentation generation + * `data-sources/` - Data source documentation templates + * `resources/` - Resource documentation templates + * `guides/` - Guide documentation templates +* 
`xpprovider/` - Additional provider functionality needed for Crossplane + +## Releasing (maintainers) Releasing is implemented in CI pipeline. @@ -65,4 +141,4 @@ To release a new provider version: - updates CHANGELOG.md with the list of changes being released. [Example](https://github.com/elastic/terraform-provider-elasticstack/commit/be866ebc918184e843dc1dd2f6e2e1b963da386d). -* Once the PR is merged, the release CI pipeline can be started by pushing a new release tag to the `main` branch. +* Once the PR is merged, the release CI pipeline can be started by pushing a new release tag to the `main` branch. (`git tag v0.11.13 && git push origin v0.11.13`) diff --git a/Makefile b/Makefile index f4d3ccfe4..1e4f374e4 100644 --- a/Makefile +++ b/Makefile @@ -52,7 +52,6 @@ build-ci: ## build the terraform provider .PHONY: build build: lint build-ci ## build the terraform provider - .PHONY: testacc testacc: ## Run acceptance tests TF_ACC=1 go test -v ./... -count $(ACCTEST_COUNT) -parallel $(ACCTEST_PARALLELISM) $(TESTARGS) -timeout $(ACCTEST_TIMEOUT) @@ -225,7 +224,7 @@ docker-clean: ## Try to remove provisioned nodes and assigned network .PHONY: docs-generate docs-generate: tools ## Generate documentation for the provider - @ go tool github.com/hashicorp/terraform-plugin-docs/cmd/tfplugindocs + @ go tool github.com/hashicorp/terraform-plugin-docs/cmd/tfplugindocs generate --provider-name terraform-provider-elasticstack .PHONY: gen @@ -246,7 +245,7 @@ install: build ## Install built provider into the local terraform cache .PHONY: tools tools: $(GOBIN) ## Download golangci-lint locally if necessary. 
- @[[ -f $(GOBIN)/golangci-lint ]] || curl -sSfL https://raw.githubusercontent.com/golangci/golangci-lint/master/install.sh | sh -s -- -b $(GOBIN) v2.2.2 + @[[ -f $(GOBIN)/golangci-lint ]] || curl -sSfL https://raw.githubusercontent.com/golangci/golangci-lint/master/install.sh | sh -s -- -b $(GOBIN) v2.4.0 .PHONY: golangci-lint golangci-lint: @@ -254,7 +253,10 @@ golangci-lint: .PHONY: lint -lint: setup golangci-lint check-fmt check-docs ## Run lints to check the spelling and common go patterns +lint: setup golangci-lint fmt docs-generate ## Run lints to check the spelling and common go patterns + +.PHONY: check-lint +check-lint: setup golangci-lint check-fmt check-docs .PHONY: fmt fmt: ## Format code @@ -350,7 +352,7 @@ generate-slo-client: tools ## generate Kibana slo client -o /local/generated/slo \ --type-mappings=float32=float64 @ rm -rf generated/slo/go.mod generated/slo/go.sum generated/slo/test - @ go fmt ./generated/... + @ go fmt ./generated/slo/... .PHONY: generate-clients generate-clients: generate-alerting-client generate-slo-client generate-connectors-client ## generate all clients diff --git a/README.md b/README.md index 9adefc90d..1de54e1c0 100644 --- a/README.md +++ b/README.md @@ -76,64 +76,9 @@ provider "elasticstack" { } ``` - ## Developing the Provider -If you wish to work on the provider, you'll first need [Go](http://www.golang.org) installed on your machine (see [Requirements](#requirements)). - -To compile the provider, run `go install`. This will build the provider and put the provider binary in the `$GOPATH/bin` directory. - -To install the provider locally into the `~/.terraform.d/plugins/...` directory one can use `make install` command. This will allow to refer this provider directly in the Terraform configuration without needing to download it from the registry. - -To generate or update documentation, run `make gen`. All the generated docs will have to be committed to the repository as well. 
- -In order to run the full suite of Acceptance tests, run `make testacc`. - -If you have [Docker](https://docs.docker.com/get-docker/) installed, you can use following command to start the Elasticsearch container and run Acceptance tests against it: - -```sh -$ make docker-testacc -``` - -To clean up the used containers and to free up the assigned container names, run `make docker-clean`. - -Note: there have been some issues encountered when using `tfenv` for local development. It's recommended you move your version management for terraform to `asdf` instead. - - -### Requirements - -- [Terraform](https://www.terraform.io/downloads.html) >= 1.0.0 -- [Go](https://golang.org/doc/install) >= 1.19 - - -### Building The Provider - -1. Clone the repository -1. Enter the repository directory -1. Build the provider using the `make install` command: -```sh -$ make install -``` - - -### Adding Dependencies - -This provider uses [Go modules](https://github.com/golang/go/wiki/Modules). -Please see the Go documentation for the most up to date information about using Go modules. - -To add a new dependency `github.com/author/dependency` to your Terraform provider: - -``` -go get github.com/author/dependency -go mod tidy -``` - -Then commit the changes to `go.mod` and `go.sum`. - -### Generating Kibana clients - -Kibana clients for some APIs are generated based on Kibana OpenAPI specs. -Please see [Makefile](./Makefile) tasks for more details. 
+See [CONTRIBUTING.md](CONTRIBUTING.md) ## Support diff --git a/docs/data-sources/elasticsearch_enrich_policy.md b/docs/data-sources/elasticsearch_enrich_policy.md index be4f00bb7..b28cc69ad 100644 --- a/docs/data-sources/elasticsearch_enrich_policy.md +++ b/docs/data-sources/elasticsearch_enrich_policy.md @@ -1,13 +1,15 @@ + --- -subcategory: "Enrich" +# generated by https://github.com/hashicorp/terraform-plugin-docs page_title: "elasticstack_elasticsearch_enrich_policy Data Source - terraform-provider-elasticstack" +subcategory: "Enrich" description: |- - Returns information about an enrich policy. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/get-enrich-policy-api.html + Returns information about an enrich policy. See the enrich policy API documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/get-enrich-policy-api.html for more details. --- -# Data Source: elasticstack_elasticsearch_enrich_policy +# elasticstack_elasticsearch_enrich_policy (Data Source) -Returns information about an enrich policy. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/get-enrich-policy-api.html +Returns information about an enrich policy. See the [enrich policy API documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/get-enrich-policy-api.html) for more details. ## Example Usage @@ -74,11 +76,35 @@ output "query" { - `name` (String) The name of the policy. +### Optional + +- `elasticsearch_connection` (Block List, Deprecated) Elasticsearch connection configuration block. (see [below for nested schema](#nestedblock--elasticsearch_connection)) + ### Read-Only - `enrich_fields` (Set of String) Fields to add to matching incoming documents. These fields must be present in the source indices. - `id` (String) Internal identifier of the resource - `indices` (Set of String) Array of one or more source indices used to create the enrich index. 
-- `match_field` (String) Field in source indices used to match incoming documents.
+- `match_field` (String) Field from the source indices used to match incoming documents.
- `policy_type` (String) The type of enrich policy, can be one of geo_match, match, range.
- `query` (String) Query used to filter documents in the enrich index. The policy only uses documents matching this query to enrich incoming documents. Defaults to a match_all query.
+
+
+### Nested Schema for `elasticsearch_connection`
+
+Optional:
+
+- `api_key` (String, Sensitive) API Key to use for authentication to Elasticsearch
+- `bearer_token` (String, Sensitive) Bearer Token to use for authentication to Elasticsearch
+- `ca_data` (String) PEM-encoded custom Certificate Authority certificate
+- `ca_file` (String) Path to a custom Certificate Authority certificate
+- `cert_data` (String) PEM encoded certificate for client auth
+- `cert_file` (String) Path to a file containing the PEM encoded certificate for client auth
+- `endpoints` (List of String, Sensitive) A list of endpoints the terraform provider will point to; these must include the http(s) scheme and port number.
+- `es_client_authentication` (String, Sensitive) ES Client Authentication field to be used with the JWT token
+- `headers` (Map of String, Sensitive) A map of headers to be sent with each request to Elasticsearch.
+- `insecure` (Boolean) Disable TLS certificate validation
+- `key_data` (String, Sensitive) PEM encoded private key for client auth
+- `key_file` (String) Path to a file containing the PEM encoded private key for client auth
+- `password` (String, Sensitive) Password to use for API authentication to Elasticsearch.
+- `username` (String) Username to use for API authentication to Elasticsearch.
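Pulling the schema above together, a minimal configuration for this data source might look like the following sketch (the policy name is a hypothetical value):

```terraform
# Look up an existing enrich policy by its name (the only required argument).
data "elasticstack_elasticsearch_enrich_policy" "policy" {
  name = "users-policy" # hypothetical policy name
}

# Read-only attributes documented above, such as match_field, are then available:
output "match_field" {
  value = data.elasticstack_elasticsearch_enrich_policy.policy.match_field
}
```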
diff --git a/docs/data-sources/elasticsearch_index_template.md b/docs/data-sources/elasticsearch_index_template.md index c1363c3cd..da0da4488 100644 --- a/docs/data-sources/elasticsearch_index_template.md +++ b/docs/data-sources/elasticsearch_index_template.md @@ -1,14 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_index_template Data Source - terraform-provider-elasticstack" subcategory: "Index" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_index_template Data Source" description: |- - Retrieves index template. + Retrieves information about an existing index template definition. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-get-template.html --- -# Data Source: elasticstack_elasticsearch_index_template +# elasticstack_elasticsearch_index_template (Data Source) -Use this data source to retrieve information about existing Elasticsearch index templates. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-get-template.html +Retrieves information about an existing index template definition. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-get-template.html ## Example Usage @@ -42,6 +43,7 @@ output "template" { - `composed_of` (List of String) An ordered list of component template names. - `data_stream` (List of Object) If this object is included, the template is used to create data streams and their backing indices. Supports an empty object. (see [below for nested schema](#nestedatt--data_stream)) - `id` (String) Internal identifier of the resource +- `ignore_missing_component_templates` (List of String) A list of component template names that are ignored if missing. - `index_patterns` (Set of String) Array of wildcard (*) expressions used to match the names of data streams and indices during creation. - `metadata` (String) Optional user metadata about the index template. 
- `priority` (Number) Priority to determine index template precedence when a new data stream or index is created. diff --git a/docs/data-sources/elasticsearch_indices.md b/docs/data-sources/elasticsearch_indices.md index 36ad08c9d..1a0fdb5f4 100644 --- a/docs/data-sources/elasticsearch_indices.md +++ b/docs/data-sources/elasticsearch_indices.md @@ -1,14 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_indices Data Source - terraform-provider-elasticstack" subcategory: "Index" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_indices Data Source" description: |- - Retrieves indices. + Retrieves information about existing Elasticsearch indices. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-get-index.html --- -# Data Source: elasticstack_elasticsearch_indices +# elasticstack_elasticsearch_indices (Data Source) -Use this data source to retrieve and get information about existing Elasticsearch indices. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-get-index.html +Retrieves information about existing Elasticsearch indices. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-get-index.html ## Example Usage diff --git a/docs/data-sources/elasticsearch_info.md b/docs/data-sources/elasticsearch_info.md index 2ea14d1d8..1545d9c1b 100644 --- a/docs/data-sources/elasticsearch_info.md +++ b/docs/data-sources/elasticsearch_info.md @@ -1,14 +1,15 @@ + --- -subcategory: "Cluster" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_info Data Source" +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_info Data Source - terraform-provider-elasticstack" +subcategory: "Elasticsearch" description: |- - Gets information about the Elasticsearch cluster. + Gets information about the Elastic cluster. 
--- -# Data Source: elasticstack_elasticsearch_info +# elasticstack_elasticsearch_info (Data Source) -This data source provides the information about the configured Elasticsearch cluster +Gets information about the Elastic cluster. ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_append.md b/docs/data-sources/elasticsearch_ingest_processor_append.md index f33f66eb2..da6750d1e 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_append.md +++ b/docs/data-sources/elasticsearch_ingest_processor_append.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_append Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_append Data Source" description: |- - Helper data source to create a processor which appends one or more values to an existing array if the field already exists and it is an array. + Helper data source which can be used to create the configuration for an append processor. This processor appends one or more values to an existing array if the field already exists and it is an array. Converts a scalar to an array and appends one or more values to it if the field exists and it is a scalar. Creates an array containing the provided values if the field doesn’t exist. See the append processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/append-processor.html for more details. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_append - -Helper data source to which can be used to create a processor to append one or more values to an existing array if the field already exists and it is an array. -Converts a scalar to an array and appends one or more values to it if the field exists and it is a scalar. Creates an array containing the provided values if the field doesn’t exist. 
+# elasticstack_elasticsearch_ingest_processor_append (Data Source) -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/append-processor.html +Helper data source which can be used to create the configuration for an append processor. This processor appends one or more values to an existing array if the field already exists and it is an array. Converts a scalar to an array and appends one or more values to it if the field exists and it is a scalar. Creates an array containing the provided values if the field doesn’t exist. See the [append processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/append-processor.html) for more details. ## Example Usage @@ -56,4 +54,3 @@ resource "elasticstack_elasticsearch_ingest_pipeline" "my_ingest_pipeline" { - `id` (String) Internal identifier of the resource - `json` (String) JSON representation of this data source. - diff --git a/docs/data-sources/elasticsearch_ingest_processor_bytes.md b/docs/data-sources/elasticsearch_ingest_processor_bytes.md index 692f8b225..4f7b5782e 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_bytes.md +++ b/docs/data-sources/elasticsearch_ingest_processor_bytes.md @@ -1,18 +1,21 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_bytes Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_bytes Data Source" description: |- - Helper data source to create a processor which converts a human readable byte value (e.g. 1kb) to its value in bytes (e.g. 1024). + Helper data source which can be used to create the configuration for a bytes processor. The processor converts a human readable byte value (e.g. 1kb) to its value in bytes (e.g. 1024). 
See the bytes processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/bytes-processor.html for more details. + If the field is an array of strings, all members of the array will be converted. + Supported human readable units are "b", "kb", "mb", "gb", "tb", "pb" case insensitive. An error will occur if the field is not a supported format or resultant value exceeds 2^63. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_bytes +# elasticstack_elasticsearch_ingest_processor_bytes (Data Source) -Helper data source to which can be used to create a processor to convert a human readable byte value (e.g. 1kb) to its value in bytes (e.g. 1024). If the field is an array of strings, all members of the array will be converted. +Helper data source which can be used to create the configuration for a bytes processor. The processor converts a human readable byte value (e.g. 1kb) to its value in bytes (e.g. 1024). See the [bytes processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/bytes-processor.html) for more details. -Supported human readable units are "b", "kb", "mb", "gb", "tb", "pb" case insensitive. An error will occur if the field is not a supported format or resultant value exceeds 2^63. +If the field is an array of strings, all members of the array will be converted. -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/bytes-processor.html +Supported human readable units are "b", "kb", "mb", "gb", "tb", "pb" case insensitive. An error will occur if the field is not a supported format or resultant value exceeds 2^63. ## Example Usage @@ -55,4 +58,3 @@ resource "elasticstack_elasticsearch_ingest_pipeline" "my_ingest_pipeline" { - `id` (String) Internal identifier of the resource - `json` (String) JSON representation of this data source. 
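As a sketch of how this helper composes with a pipeline, its `json` attribute can be passed to the ingest pipeline resource shown in these docs (the `field` value and pipeline name here are illustrative):

```terraform
data "elasticstack_elasticsearch_ingest_processor_bytes" "bytes" {
  field = "file.size" # e.g. a value of "1kb" is converted to 1024
}

resource "elasticstack_elasticsearch_ingest_pipeline" "my_ingest_pipeline" {
  name = "bytes-pipeline" # illustrative pipeline name
  processors = [
    data.elasticstack_elasticsearch_ingest_processor_bytes.bytes.json,
  ]
}
```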
-
diff --git a/docs/data-sources/elasticsearch_ingest_processor_circle.md b/docs/data-sources/elasticsearch_ingest_processor_circle.md
index a526a56f5..b5784a6d1 100644
--- a/docs/data-sources/elasticsearch_ingest_processor_circle.md
+++ b/docs/data-sources/elasticsearch_ingest_processor_circle.md
@@ -1,16 +1,15 @@
+
---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_ingest_processor_circle Data Source - terraform-provider-elasticstack"
subcategory: "Ingest"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_circle Data Source"
description: |-
- Helper data source to create a processor which converts circle definitions of shapes to regular polygons which approximate them.
+ Helper data source which can be used to create the configuration for a circle processor. This processor converts circle definitions of shapes to regular polygons which approximate them. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest-circle-processor.html
---
-# Data Source: elasticstack_elasticsearch_ingest_processor_circle
-
-Helper data source to which can be used to create a processor to convert circle definitions of shapes to regular polygons which approximate them.
+# elasticstack_elasticsearch_ingest_processor_circle (Data Source)
-See: https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest-circle-processor.html
+Helper data source which can be used to create the configuration for a circle processor. This processor converts circle definitions of shapes to regular polygons which approximate them. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest-circle-processor.html
## Example Usage
@@ -57,4 +56,3 @@ resource "elasticstack_elasticsearch_ingest_pipeline" "my_ingest_pipeline" {
- `id` (String) Internal identifier of the resource
- `json` (String) JSON representation of this data source.
- diff --git a/docs/data-sources/elasticsearch_ingest_processor_community_id.md b/docs/data-sources/elasticsearch_ingest_processor_community_id.md index bb9bda3d1..c7376d59c 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_community_id.md +++ b/docs/data-sources/elasticsearch_ingest_processor_community_id.md @@ -1,20 +1,21 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_community_id Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_community_id Data Source" description: |- - Helper data source to create a processor which computes the Community ID for network flow data as defined in the Community ID Specification. + Helper data source which can be used to create the configuration for a community ID processor. This processor computes the Community ID for network flow data as defined in the Community ID Specification. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/community-id-processor.html + You can use a community ID to correlate network events related to a single flow. + The community ID processor reads network flow data from related Elastic Common Schema (ECS) https://www.elastic.co/guide/en/ecs/1.12 fields by default. If you use the ECS, no configuration is required. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_community_id +# elasticstack_elasticsearch_ingest_processor_community_id (Data Source) -Helper data source to which can be used to create a processor to compute the Community ID for network flow data as defined in the [Community ID Specification](https://github.com/corelight/community-id-spec). +Helper data source which can be used to create the configuration for a community ID processor. This processor computes the Community ID for network flow data as defined in the Community ID Specification. 
See: https://www.elastic.co/guide/en/elasticsearch/reference/current/community-id-processor.html You can use a community ID to correlate network events related to a single flow. The community ID processor reads network flow data from related [Elastic Common Schema (ECS)](https://www.elastic.co/guide/en/ecs/1.12) fields by default. If you use the ECS, no configuration is required. -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/community-id-processor.html - ## Example Usage ```terraform @@ -59,4 +60,3 @@ resource "elasticstack_elasticsearch_ingest_pipeline" "my_ingest_pipeline" { - `id` (String) Internal identifier of the resource - `json` (String) JSON representation of this data source. - diff --git a/docs/data-sources/elasticsearch_ingest_processor_convert.md b/docs/data-sources/elasticsearch_ingest_processor_convert.md index 25f6d94be..8e120aa6f 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_convert.md +++ b/docs/data-sources/elasticsearch_ingest_processor_convert.md @@ -1,25 +1,37 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_convert Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_convert Data Source" description: |- - Helper data source to create a processor which converts a field in the currently ingested document to a different type, such as converting a string to an integer. + Helper data source which can be used to create the configuration for a convert processor. This processor converts a field in the currently ingested document to a different type, such as converting a string to an integer. See the convert processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/convert-processor.html for more details. 
+ The supported types include: + integerlongfloatdoublestringbooleanipauto + Specifying boolean will set the field to true if its string value is equal to true (ignoring case), to false if its string value is equal to false (ignoring case), or it will throw an exception otherwise. + Specifying ip will set the target field to the value of field if it contains a valid IPv4 or IPv6 address that can be indexed into an IP field type. + Specifying auto will attempt to convert the string-valued field into the closest non-string, non-IP type. For example, a field whose value is "true" will be converted to its respective boolean type: true. Do note that float takes precedence of double in auto. A value of "242.15" will "automatically" be converted to 242.15 of type float. If a provided field cannot be appropriately converted, the processor will still process successfully and leave the field value as-is. In such a case, target_field will be updated with the unconverted field value. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_convert +# elasticstack_elasticsearch_ingest_processor_convert (Data Source) -Helper data source to which can be used to convert a field in the currently ingested document to a different type, such as converting a string to an integer. If the field value is an array, all members will be converted. +Helper data source which can be used to create the configuration for a convert processor. This processor converts a field in the currently ingested document to a different type, such as converting a string to an integer. See the [convert processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/convert-processor.html) for more details. -The supported types include: `integer`, `long`, `float`, `double`, `string`, `boolean`, `ip`, and `auto`. 
+The supported types include: +- `integer` +- `long` +- `float` +- `double` +- `string` +- `boolean` +- `ip` +- `auto` -Specifying `boolean` will set the field to true if its string value is equal to true (ignore case), to false if its string value is equal to false (ignore case), or it will throw an exception otherwise. +Specifying `boolean` will set the field to true if its string value is equal to true (ignoring case), to false if its string value is equal to false (ignoring case), or it will throw an exception otherwise. Specifying `ip` will set the target field to the value of `field` if it contains a valid IPv4 or IPv6 address that can be indexed into an IP field type. Specifying `auto` will attempt to convert the string-valued `field` into the closest non-string, non-IP type. For example, a field whose value is "true" will be converted to its respective boolean type: true. Do note that float takes precedence of double in auto. A value of "242.15" will "automatically" be converted to 242.15 of type `float`. If a provided field cannot be appropriately converted, the processor will still process successfully and leave the field value as-is. In such a case, `target_field` will be updated with the unconverted field value. -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/convert-processor.html - ## Example Usage ```terraform @@ -64,4 +76,3 @@ resource "elasticstack_elasticsearch_ingest_pipeline" "my_ingest_pipeline" { - `id` (String) Internal identifier of the resource - `json` (String) JSON representation of this data source. 
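For instance, assuming the processor's standard `field` and `type` arguments, the type list above could be exercised with a sketch like:

```terraform
data "elasticstack_elasticsearch_ingest_processor_convert" "convert" {
  field = "http.response.status_code" # illustrative source field
  type  = "integer"                   # one of the supported types listed above
}
```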
-
diff --git a/docs/data-sources/elasticsearch_ingest_processor_csv.md b/docs/data-sources/elasticsearch_ingest_processor_csv.md
index b1b0aa307..be515d791 100644
--- a/docs/data-sources/elasticsearch_ingest_processor_csv.md
+++ b/docs/data-sources/elasticsearch_ingest_processor_csv.md
@@ -1,16 +1,18 @@
+
---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_ingest_processor_csv Data Source - terraform-provider-elasticstack"
subcategory: "Ingest"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_csv Data Source"
description: |-
- Helper data source to create a processor which extracts fields from CSV line out of a single text field within a document.
+ Helper data source which can be used to create the configuration for a CSV processor. This processor extracts fields from a CSV line out of a single text field within a document. Any empty field in CSV will be skipped. See the CSV processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/csv-processor.html for more details.
+ If the trim option is enabled then any whitespace at the beginning and end of each unquoted field will be trimmed. For example, with the configuration above, a value of A, B will result in field field2 having value {nbsp}B (with space at the beginning). If trim is enabled A, B will result in field field2 having value B (no whitespace). Quoted fields will be left untouched.
---
-# Data Source: elasticstack_elasticsearch_ingest_processor_csv
+# elasticstack_elasticsearch_ingest_processor_csv (Data Source)
-Helper data source to which can be used to extract fields from CSV line out of a single text field within a document. Any empty field in CSV will be skipped.
+Helper data source which can be used to create the configuration for a CSV processor. This processor extracts fields from a CSV line out of a single text field within a document. Any empty field in CSV will be skipped. See the [CSV processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/csv-processor.html) for more details.
-See: https://www.elastic.co/guide/en/elasticsearch/reference/current/csv-processor.html
+If the `trim` option is enabled then any whitespace at the beginning and end of each unquoted field will be trimmed. For example, with the configuration above, a value of A, B will result in field field2 having value {nbsp}B (with space at the beginning). If trim is enabled A, B will result in field field2 having value B (no whitespace). Quoted fields will be left untouched.
## Example Usage
@@ -33,8 +35,6 @@ resource "elasticstack_elasticsearch_ingest_pipeline" "my_ingest_pipeline" {
}
```
-If the `trim` option is enabled then any whitespace in the beginning and in the end of each unquoted field will be trimmed. For example with configuration above, a value of A, B will result in field field2 having value {nbsp}B (with space at the beginning). If trim is enabled A, B will result in field field2 having value B (no whitespace). Quoted fields will be left untouched.
-
## Schema
@@ -60,4 +60,3 @@ If the `trim` option is enabled then any whitespace in the beginning and in the
- `id` (String) Internal identifier of the resource
- `json` (String) JSON representation of this data source.
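As an illustrative sketch of the `trim` behaviour described above (the field names, and the `target_fields`/`trim` arguments, are assumed to mirror the upstream CSV processor options):

```terraform
data "elasticstack_elasticsearch_ingest_processor_csv" "csv" {
  field         = "my_field"           # text field containing the CSV line
  target_fields = ["field1", "field2"] # columns are assigned in this order
  trim          = true                 # strip whitespace around unquoted values
}
```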
- diff --git a/docs/data-sources/elasticsearch_ingest_processor_date.md b/docs/data-sources/elasticsearch_ingest_processor_date.md index 69d002d8a..14e7a8a31 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_date.md +++ b/docs/data-sources/elasticsearch_ingest_processor_date.md @@ -1,22 +1,21 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_date Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_date Data Source" description: |- - Helper data source to create a processor which parses dates from fields, and then uses the date or timestamp as the timestamp for the document. + Helper data source which can be used to create the configuration for a date processor. This processor parses dates from fields, and then uses the date or timestamp as the timestamp for the document. See the date processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/date-processor.html for more details. + By default, the date processor adds the parsed date as a new field called @timestamp. You can specify a different field by setting the target_field configuration parameter. Multiple date formats are supported as part of the same date processor definition. They will be used sequentially to attempt parsing the date field, in the same order they were defined as part of the processor definition. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_date +# elasticstack_elasticsearch_ingest_processor_date (Data Source) -Helper data source to which can be used to parse dates from fields, and then uses the date or timestamp as the timestamp for the document. -By default, the date processor adds the parsed date as a new field called `@timestamp`. You can specify a different field by setting the `target_field` configuration parameter. 
Multiple date formats are supported as part of the same date processor definition. They will be used sequentially to attempt parsing the date field, in the same order they were defined as part of the processor definition. +Helper data source which can be used to create the configuration for a date processor. This processor parses dates from fields, and then uses the date or timestamp as the timestamp for the document. See the [date processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/date-processor.html) for more details. -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/date-processor.html +By default, the date processor adds the parsed date as a new field called `@timestamp`. You can specify a different field by setting the `target_field` configuration parameter. Multiple date formats are supported as part of the same date processor definition. They will be used sequentially to attempt parsing the date field, in the same order they were defined as part of the processor definition. 
## Example Usage -Here is an example that adds the parsed date to the `timestamp` field based on the `initial_date` field: - ```terraform provider "elasticstack" { elasticsearch {} diff --git a/docs/data-sources/elasticsearch_ingest_processor_date_index_name.md b/docs/data-sources/elasticsearch_ingest_processor_date_index_name.md index 64580f566..409c639b8 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_date_index_name.md +++ b/docs/data-sources/elasticsearch_ingest_processor_date_index_name.md @@ -1,21 +1,22 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_date_index_name Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_date_index_name Data Source" description: |- - Helper data source to create a processor which helps to point documents to the right time based index based on a date or timestamp field in a document by using the date math index name support. + Helper data source which can be used to create the configuration for a date index name processor. The purpose of this processor is to point documents to the right time based index based on a date or timestamp field in a document by using the date math index name support. See the date index name processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/date-index-name-processor.html for more details. + The processor sets the _index metadata field with a date math index name expression based on the provided index name prefix, a date or timestamp field in the documents being processed and the provided date rounding. + First, this processor fetches the date or timestamp from a field in the document being processed. Optionally, date formatting can be configured on how the field’s value should be parsed into a date. 
Then this date, the provided index name prefix and the provided date rounding get formatted into a date math index name expression. Also here optionally date formatting can be specified on how the date should be formatted into a date math index name expression. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_date_index_name +# elasticstack_elasticsearch_ingest_processor_date_index_name (Data Source) -The purpose of this processor is to point documents to the right time based index based on a date or timestamp field in a document by using the date math index name support. +Helper data source which can be used to create the configuration for a date index name processor. The purpose of this processor is to point documents to the right time based index based on a date or timestamp field in a document by using the date math index name support. See the [date index name processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/date-index-name-processor.html) for more details. The processor sets the _index metadata field with a date math index name expression based on the provided index name prefix, a date or timestamp field in the documents being processed and the provided date rounding. First, this processor fetches the date or timestamp from a field in the document being processed. Optionally, date formatting can be configured on how the field’s value should be parsed into a date. Then this date, the provided index name prefix and the provided date rounding get formatted into a date math index name expression. Also here optionally date formatting can be specified on how the date should be formatted into a date math index name expression. 
-See: https://www.elastic.co/guide/en/elasticsearch/reference/current/date-index-name-processor.html - ## Example Usage ```terraform diff --git a/docs/data-sources/elasticsearch_ingest_processor_dissect.md b/docs/data-sources/elasticsearch_ingest_processor_dissect.md index 4f8bf46eb..5754da6df 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_dissect.md +++ b/docs/data-sources/elasticsearch_ingest_processor_dissect.md @@ -1,20 +1,22 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_dissect Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_dissect Data Source" description: |- - Helper data source to create a processor which extracts structured fields out of a single text field within a document. + Helper data source which can be used to create the configuration for a dissect processor. This processor extracts structured fields out of a single text field within a document. See the dissect processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/dissect-processor.html#dissect-processor for more details. + Similar to the Grok Processor, dissect also extracts structured fields out of a single text field within a document. However unlike the Grok Processor, dissect does not use Regular Expressions. This allows dissect’s syntax to be simple and for some cases faster than the Grok Processor. + Dissect matches a single text field against a defined pattern. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_dissect +# elasticstack_elasticsearch_ingest_processor_dissect (Data Source) + +Helper data source which can be used to create the configuration for a dissect processor. This processor extracts structured fields out of a single text field within a document. 
See the [dissect processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/dissect-processor.html#dissect-processor) for more details. Similar to the Grok Processor, dissect also extracts structured fields out of a single text field within a document. However unlike the Grok Processor, dissect does not use Regular Expressions. This allows dissect’s syntax to be simple and for some cases faster than the Grok Processor. Dissect matches a single text field against a defined pattern. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/dissect-processor.html - ## Example Usage ```terraform diff --git a/docs/data-sources/elasticsearch_ingest_processor_dot_expander.md b/docs/data-sources/elasticsearch_ingest_processor_dot_expander.md index 6c0743fae..0f5d11fc0 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_dot_expander.md +++ b/docs/data-sources/elasticsearch_ingest_processor_dot_expander.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_dot_expander Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_dot_expander Data Source" description: |- - Helper data source to create a processor which expands a field with dots into an object field. + Helper data source which can be used to create the configuration for a dot expander processor. This processor expands a field with dots into an object field. See the dot expand processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/dot-expand-processor.html for more details. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_dot_expander - -Expands a field with dots into an object field. This processor allows fields with dots in the name to be accessible by other processors in the pipeline. 
Otherwise these fields can’t be accessed by any processor. - -See: elastic.co/guide/en/elasticsearch/reference/current/dot-expand-processor.html +# elasticstack_elasticsearch_ingest_processor_dot_expander (Data Source) +Helper data source which can be used to create the configuration for a dot expander processor. This processor expands a field with dots into an object field. See the [dot expand processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/dot-expand-processor.html) for more details. ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_drop.md b/docs/data-sources/elasticsearch_ingest_processor_drop.md index cb7ebd9f8..c677749b7 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_drop.md +++ b/docs/data-sources/elasticsearch_ingest_processor_drop.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_drop Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_drop Data Source" description: |- - Helper data source to create a processor which drops the document without raising any errors. + Helper data source which can be used to create the configuration for a drop processor. This processor drops the document without raising any errors. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/drop-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_drop - -Drops the document without raising any errors. This is useful to prevent the document from getting indexed based on some condition. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/drop-processor.html +# elasticstack_elasticsearch_ingest_processor_drop (Data Source) +Helper data source which can be used to create the configuration for a drop processor. 
This processor drops the document without raising any errors. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/drop-processor.html ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_enrich.md b/docs/data-sources/elasticsearch_ingest_processor_enrich.md index b1f66e565..4e4c61d14 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_enrich.md +++ b/docs/data-sources/elasticsearch_ingest_processor_enrich.md @@ -1,16 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_enrich Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_enrich Data Source" description: |- - Helper data source to create a processor which enriches documents with data from another index. + Helper data source which can be used to create the configuration for an enrich processor. The enrich processor can enrich documents with data from another index. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/enrich-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_enrich - -The enrich processor can enrich documents with data from another index. See enrich data section for more information about how to set this up. +# elasticstack_elasticsearch_ingest_processor_enrich (Data Source) -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest-enriching-data.html and https://www.elastic.co/guide/en/elasticsearch/reference/current/enrich-processor.html +Helper data source which can be used to create the configuration for an enrich processor. The enrich processor can enrich documents with data from another index. 
See: https://www.elastic.co/guide/en/elasticsearch/reference/current/enrich-processor.html ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_fail.md b/docs/data-sources/elasticsearch_ingest_processor_fail.md index 3ae3b778b..3ebc14892 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_fail.md +++ b/docs/data-sources/elasticsearch_ingest_processor_fail.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_fail Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_fail Data Source" description: |- - Helper data source to create a processor which raises an exception. + Helper data source which can be used to create the configuration for a fail processor. This processor raises an exception. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/fail-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_fail - -Raises an exception. This is useful for when you expect a pipeline to fail and want to relay a specific message to the requester. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/fail-processor.html +# elasticstack_elasticsearch_ingest_processor_fail (Data Source) +Helper data source which can be used to create the configuration for a fail processor. This processor raises an exception. 
See: https://www.elastic.co/guide/en/elasticsearch/reference/current/fail-processor.html ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_fingerprint.md b/docs/data-sources/elasticsearch_ingest_processor_fingerprint.md index f852e051f..264b60787 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_fingerprint.md +++ b/docs/data-sources/elasticsearch_ingest_processor_fingerprint.md @@ -1,16 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_fingerprint Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_fingerprint Data Source" description: |- - Helper data source to create a processor which computes a hash of the document’s content. + Helper data source which can be used to create the configuration for a fingerprint processor. This processor computes a hash of the document’s content. See the fingerprint processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/fingerprint-processor.html for more details. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_fingerprint - -Computes a hash of the document’s content. You can use this hash for content fingerprinting. +# elasticstack_elasticsearch_ingest_processor_fingerprint (Data Source) -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/fingerprint-processor.html +Helper data source which can be used to create the configuration for a fingerprint processor. This processor computes a hash of the document’s content. See the [fingerprint processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/fingerprint-processor.html) for more details. 
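As a minimal sketch (the `user` field is illustrative; by default the hash is written to the `fingerprint` field):

```terraform
# Compute a hash over the "user" field's content.
data "elasticstack_elasticsearch_ingest_processor_fingerprint" "fp" {
  fields = ["user"]
}

resource "elasticstack_elasticsearch_ingest_pipeline" "fp" {
  name       = "fingerprint-pipeline"
  processors = [data.elasticstack_elasticsearch_ingest_processor_fingerprint.fp.json]
}
```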
## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_foreach.md b/docs/data-sources/elasticsearch_ingest_processor_foreach.md index a448a7234..32aea4bdd 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_foreach.md +++ b/docs/data-sources/elasticsearch_ingest_processor_foreach.md @@ -1,22 +1,26 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_foreach Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_foreach Data Source" description: |- - Helper data source to create a processor which runs an ingest processor on each element of an array or object. + Helper data source which can be used to create the configuration for a foreach processor. This processor runs an ingest processor on each element of an array or object. See the foreach processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/foreach-processor.html for more details. + All ingest processors can run on array or object elements. However, if the number of elements is unknown, it can be cumbersome to process each one in the same way. + The foreach processor lets you specify a field containing array or object values and a processor to run on each element in the field. + Access keys and values + When iterating through an array or object, the foreach processor stores the current element’s value in the _ingest._value ingest metadata field. _ingest._value contains the entire element value, including any child fields. You can access child field values using dot notation on the _ingest._value field. + When iterating through an object, the foreach processor also stores the current element’s key as a string in _ingest._key. + You can access and change _ingest._key and _ingest._value in the processor. 
--- -# Data Source: elasticstack_elasticsearch_ingest_processor_foreach +# elasticstack_elasticsearch_ingest_processor_foreach (Data Source) -Runs an ingest processor on each element of an array or object. +Helper data source which can be used to create the configuration for a foreach processor. This processor runs an ingest processor on each element of an array or object. See the [foreach processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/foreach-processor.html) for more details. All ingest processors can run on array or object elements. However, if the number of elements is unknown, it can be cumbersome to process each one in the same way. The `foreach` processor lets you specify a `field` containing array or object values and a `processor` to run on each element in the field. -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/foreach-processor.html - - ### Access keys and values When iterating through an array or object, the foreach processor stores the current element’s value in the `_ingest._value` ingest metadata field. `_ingest._value` contains the entire element value, including any child fields. You can access child field values using dot notation on the `_ingest._value` field. @@ -25,8 +29,6 @@ When iterating through an object, the foreach processor also stores the current You can access and change `_ingest._key` and `_ingest._value` in the processor. 
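A minimal sketch of the `field`/`processor` pairing described above, assuming the inner processor is supplied as the rendered JSON of another processor data source (the `tags` field name is illustrative):

```terraform
# Inner processor applied to each element; it reads the current element
# from the _ingest._value metadata field described above.
data "elasticstack_elasticsearch_ingest_processor_uppercase" "each" {
  field = "_ingest._value"
}

# foreach runs the inner processor on every element of the "tags" array.
data "elasticstack_elasticsearch_ingest_processor_foreach" "tags" {
  field     = "tags"
  processor = data.elasticstack_elasticsearch_ingest_processor_uppercase.each.json
}
```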
- - ## Example Usage ```terraform diff --git a/docs/data-sources/elasticsearch_ingest_processor_geoip.md b/docs/data-sources/elasticsearch_ingest_processor_geoip.md index efec6890d..56cbd111c 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_geoip.md +++ b/docs/data-sources/elasticsearch_ingest_processor_geoip.md @@ -1,24 +1,24 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_geoip Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_geoip Data Source" description: |- - Helper data source to create a processor which adds information about the geographical location of an IPv4 or IPv6 address. + Helper data source which can be used to create the configuration for a geoip processor. The geoip processor adds information about the geographical location of an IPv4 or IPv6 address. See the geoip processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/geoip-processor.html for more details. + By default, the processor uses the GeoLite2 City, GeoLite2 Country, and GeoLite2 ASN GeoIP2 databases from MaxMind, shared under the CC BY-SA 4.0 license. Elasticsearch automatically downloads updates for these databases from the Elastic GeoIP endpoint: https://geoip.elastic.co/v1/database. To get download statistics for these updates, use the GeoIP stats API. + If your cluster can’t connect to the Elastic GeoIP endpoint or you want to manage your own updates, see Manage your own GeoIP2 database updates https://www.elastic.co/guide/en/elasticsearch/reference/current/geoip-processor.html#manage-geoip-database-updates. + If Elasticsearch can’t connect to the endpoint for 30 days all updated databases will become invalid. Elasticsearch will stop enriching documents with geoip data and will add tags: ["_geoip_expired_database"] field instead. 
--- -# Data Source: elasticstack_elasticsearch_ingest_processor_geoip +# elasticstack_elasticsearch_ingest_processor_geoip (Data Source) -The geoip processor adds information about the geographical location of an IPv4 or IPv6 address. +Helper data source which can be used to create the configuration for a geoip processor. The geoip processor adds information about the geographical location of an IPv4 or IPv6 address. See the [geoip processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/geoip-processor.html) for more details. By default, the processor uses the GeoLite2 City, GeoLite2 Country, and GeoLite2 ASN GeoIP2 databases from MaxMind, shared under the CC BY-SA 4.0 license. Elasticsearch automatically downloads updates for these databases from the Elastic GeoIP endpoint: https://geoip.elastic.co/v1/database. To get download statistics for these updates, use the GeoIP stats API. If your cluster can’t connect to the Elastic GeoIP endpoint or you want to manage your own updates, [see Manage your own GeoIP2 database updates](https://www.elastic.co/guide/en/elasticsearch/reference/current/geoip-processor.html#manage-geoip-database-updates). -If Elasticsearch can’t connect to the endpoint for 30 days all updated databases will become invalid. Elasticsearch will stop enriching documents with geoip data and will add tags: ["_geoip_expired_database"] field instead. - - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/geoip-processor.html - +If Elasticsearch can’t connect to the endpoint for 30 days all updated databases will become invalid. Elasticsearch will stop enriching documents with geoip data and will add `tags: ["_geoip_expired_database"]` field instead. 
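A minimal sketch (the `ip` field name is illustrative; geo data is written to `geoip` by default):

```terraform
# Enrich documents with the geographical location of the "ip" field.
data "elasticstack_elasticsearch_ingest_processor_geoip" "geoip" {
  field = "ip"
}

resource "elasticstack_elasticsearch_ingest_pipeline" "geoip" {
  name       = "geoip-pipeline"
  processors = [data.elasticstack_elasticsearch_ingest_processor_geoip.geoip.json]
}
```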
## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_grok.md b/docs/data-sources/elasticsearch_ingest_processor_grok.md index 9a078fc26..4a002a33c 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_grok.md +++ b/docs/data-sources/elasticsearch_ingest_processor_grok.md @@ -1,23 +1,22 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_grok Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_grok Data Source" description: |- - Helper data source to create a processor which extracts structured fields out of a single text field within a document. + Helper data source which can be used to create the configuration for a grok processor. This processor extracts structured fields out of a single text field within a document. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/grok-processor.html + This processor comes packaged with many reusable patterns https://github.com/elastic/elasticsearch/blob/master/libs/grok/src/main/resources/patterns. + If you need help building patterns to match your logs, you will find the Grok Debugger https://www.elastic.co/guide/en/kibana/master/xpack-grokdebugger.html tool quite useful! The Grok Constructor https://grokconstructor.appspot.com/ is also a useful tool. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_grok +# elasticstack_elasticsearch_ingest_processor_grok (Data Source) -Extracts structured fields out of a single text field within a document. You choose which field to extract matched fields from, as well as the grok pattern you expect will match. A grok pattern is like a regular expression that supports aliased expressions that can be reused. +Helper data source which can be used to create the configuration for a grok processor. 
This processor extracts structured fields out of a single text field within a document. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/grok-processor.html This processor comes packaged with many [reusable patterns](https://github.com/elastic/elasticsearch/blob/master/libs/grok/src/main/resources/patterns). If you need help building patterns to match your logs, you will find the [Grok Debugger](https://www.elastic.co/guide/en/kibana/master/xpack-grokdebugger.html) tool quite useful! [The Grok Constructor](https://grokconstructor.appspot.com/) is also a useful tool. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/grok-processor.html - - ## Example Usage ```terraform diff --git a/docs/data-sources/elasticsearch_ingest_processor_gsub.md b/docs/data-sources/elasticsearch_ingest_processor_gsub.md index 3798599c7..e75bb2f4a 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_gsub.md +++ b/docs/data-sources/elasticsearch_ingest_processor_gsub.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_gsub Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_gsub Data Source" description: |- - Helper data source to create a processor which converts a string field by applying a regular expression and a replacement. + Helper data source which can be used to create the configuration for a gsub processor. This processor converts a string field by applying a regular expression and a replacement. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/gsub-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_gsub - -Converts a string field by applying a regular expression and a replacement. If the field is an array of string, all members of the array will be converted. 
If any non-string values are encountered, the processor will throw an exception. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/gsub-processor.html +# elasticstack_elasticsearch_ingest_processor_gsub (Data Source) +Helper data source which can be used to create the configuration for a gsub processor. This processor converts a string field by applying a regular expression and a replacement. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/gsub-processor.html ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_html_strip.md b/docs/data-sources/elasticsearch_ingest_processor_html_strip.md index ba34acda0..171b76ed8 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_html_strip.md +++ b/docs/data-sources/elasticsearch_ingest_processor_html_strip.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_html_strip Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_html_strip Data Source" description: |- - Helper data source to create a processor which removes HTML tags from the field. + Helper data source which can be used to create the configuration for an HTML strip processor. This processor removes HTML tags from the field. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/htmlstrip-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_html_strip - -Removes HTML tags from the field. If the field is an array of strings, HTML tags will be removed from all members of the array. - -See: templates/data-sources/elasticsearch_ingest_processor_html_strip.md.tmpl +# elasticstack_elasticsearch_ingest_processor_html_strip (Data Source) +Helper data source which can be used to create the configuration for an HTML strip processor. 
This processor removes HTML tags from the field. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/htmlstrip-processor.html ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_join.md b/docs/data-sources/elasticsearch_ingest_processor_join.md index 866178a67..a46139936 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_join.md +++ b/docs/data-sources/elasticsearch_ingest_processor_join.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_join Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_join Data Source" description: |- - Helper data source to create a processor which joins each element of an array into a single string using a separator character between each element. + Helper data source which can be used to create the configuration for a join processor. This processor joins each element of an array into a single string using a separator character between each element. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/join-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_join - -Joins each element of an array into a single string using a separator character between each element. Throws an error when the field is not an array. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/join-processor.html +# elasticstack_elasticsearch_ingest_processor_join (Data Source) +Helper data source which can be used to create the configuration for a join processor. This processor joins each element of an array into a single string using a separator character between each element. 
See: https://www.elastic.co/guide/en/elasticsearch/reference/current/join-processor.html ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_json.md b/docs/data-sources/elasticsearch_ingest_processor_json.md index f7b3d3c5a..e2a1e8989 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_json.md +++ b/docs/data-sources/elasticsearch_ingest_processor_json.md @@ -1,16 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_json Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_json Data Source" description: |- - Helper data source to create a processor which converts a JSON string into a structured JSON object. + Helper data source which can be used to create the configuration for a JSON processor. This processor converts a JSON string into a structured JSON object. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/json-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_json - -Converts a JSON string into a structured JSON object. +# elasticstack_elasticsearch_ingest_processor_json (Data Source) -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/json-processor.html +Helper data source which can be used to create the configuration for a JSON processor. This processor converts a JSON string into a structured JSON object. 
See: https://www.elastic.co/guide/en/elasticsearch/reference/current/json-processor.html ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_kv.md b/docs/data-sources/elasticsearch_ingest_processor_kv.md index 7dc000a0a..c3c901e98 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_kv.md +++ b/docs/data-sources/elasticsearch_ingest_processor_kv.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_kv Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_kv Data Source" description: |- - Helper data source to create a processor which helps automatically parse messages (or specific event fields) which are of the `foo=bar` variety. + Helper data source which can be used to create the configuration for a KV processor. This processor helps automatically parse messages (or specific event fields) which are of the foo=bar variety. See the KV processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/kv-processor.html for more details. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_kv - -This processor helps automatically parse messages (or specific event fields) which are of the `foo=bar` variety. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/kv-processor.html +# elasticstack_elasticsearch_ingest_processor_kv (Data Source) +Helper data source which can be used to create the configuration for a KV processor. This processor helps automatically parse messages (or specific event fields) which are of the foo=bar variety. See the [KV processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/kv-processor.html) for more details. 
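A minimal sketch: splitting a `message` such as `ip=1.2.3.4 error=REFUSED` on spaces and `=` yields `ip` and `error` fields (the field name is illustrative):

```terraform
# Split key-value pairs on spaces, and keys from values on "=".
data "elasticstack_elasticsearch_ingest_processor_kv" "kv" {
  field       = "message"
  field_split = " "
  value_split = "="
}
```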
## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_lowercase.md b/docs/data-sources/elasticsearch_ingest_processor_lowercase.md index b8f6c903b..0e030f1cd 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_lowercase.md +++ b/docs/data-sources/elasticsearch_ingest_processor_lowercase.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_lowercase Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_lowercase Data Source" description: |- - Helper data source to create a processor which converts a string to its lowercase equivalent. + Helper data source which can be used to create the configuration for a lowercase processor. This processor converts a string to its lowercase equivalent. If the field is an array of strings, all members of the array will be converted. See the lowercase processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/lowercase-processor.html for more details. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_lowercase - -Converts a string to its lowercase equivalent. If the field is an array of strings, all members of the array will be converted. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/lowercase-processor.html +# elasticstack_elasticsearch_ingest_processor_lowercase (Data Source) +Helper data source which can be used to create the configuration for a lowercase processor. This processor converts a string to its lowercase equivalent. If the field is an array of strings, all members of the array will be converted. See the [lowercase processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/lowercase-processor.html) for more details. 
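As a minimal sketch (the `foo` field name and pipeline name are illustrative):

```terraform
# Convert the "foo" field's value(s) to lowercase.
data "elasticstack_elasticsearch_ingest_processor_lowercase" "lc" {
  field = "foo"
}

resource "elasticstack_elasticsearch_ingest_pipeline" "lc" {
  name       = "lowercase-pipeline"
  processors = [data.elasticstack_elasticsearch_ingest_processor_lowercase.lc.json]
}
```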
## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_network_direction.md b/docs/data-sources/elasticsearch_ingest_processor_network_direction.md index 7ab772a41..956158833 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_network_direction.md +++ b/docs/data-sources/elasticsearch_ingest_processor_network_direction.md @@ -1,21 +1,26 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_network_direction Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_network_direction Data Source" description: |- - Helper data source to create a processor which calculates the network direction given a source IP address, destination IP address, and a list of internal networks. + Helper data source which can be used to create the configuration for a network direction processor. This processor calculates the network direction given a source IP address, destination IP address, and a list of internal networks. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/network-direction-processor.html + The network direction processor reads IP addresses from Elastic Common Schema (ECS) fields by default. If you use the ECS, only the internal_networks option must be specified. + One of either internal_networks or internal_networks_field must be specified. If internal_networks_field is specified, it follows the behavior specified by ignore_missing. + Supported named network ranges + The named ranges supported for the internal_networks option are: + loopback - Matches loopback addresses in the range of 127.0.0.0/8 or ::1/128.unicast or global_unicast - Matches global unicast addresses defined in RFC 1122, RFC 4632, and RFC 4291 with the exception of the IPv4 broadcast address (255.255.255.255). 
This includes private address ranges.multicast - Matches multicast addresses.interface_local_multicast - Matches IPv6 interface-local multicast addresses.link_local_unicast - Matches link-local unicast addresses.link_local_multicast - Matches link-local multicast addresses.private - Matches private address ranges defined in RFC 1918 (IPv4) and RFC 4193 (IPv6).public - Matches addresses that are not loopback, unspecified, IPv4 broadcast, link local unicast, link local multicast, interface local multicast, or private.unspecified - Matches unspecified addresses (either the IPv4 address "0.0.0.0" or the IPv6 address "::"). --- -# Data Source: elasticstack_elasticsearch_ingest_processor_network_direction +# elasticstack_elasticsearch_ingest_processor_network_direction (Data Source) -Calculates the network direction given a source IP address, destination IP address, and a list of internal networks. +Helper data source which can be used to create the configuration for a network direction processor. This processor calculates the network direction given a source IP address, destination IP address, and a list of internal networks. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/network-direction-processor.html The network direction processor reads IP addresses from Elastic Common Schema (ECS) fields by default. If you use the ECS, only the `internal_networks` option must be specified. - One of either `internal_networks` or `internal_networks_field` must be specified. If `internal_networks_field` is specified, it follows the behavior specified by `ignore_missing`. -### Supported named network rangese +### Supported named network ranges The named ranges supported for the internal_networks option are: @@ -29,10 +34,6 @@ The named ranges supported for the internal_networks option are: * `public` - Matches addresses that are not loopback, unspecified, IPv4 broadcast, link local unicast, link local multicast, interface local multicast, or private. 
* `unspecified` - Matches unspecified addresses (either the IPv4 address "0.0.0.0" or the IPv6 address "::"). - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/network-direction-processor.html - - ## Example Usage ```terraform diff --git a/docs/data-sources/elasticsearch_ingest_processor_pipeline.md b/docs/data-sources/elasticsearch_ingest_processor_pipeline.md index 4374db6ce..8cf90edd5 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_pipeline.md +++ b/docs/data-sources/elasticsearch_ingest_processor_pipeline.md @@ -1,19 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_pipeline Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_pipeline Data Source" description: |- - Helper data source to create a processor which executes another pipeline. + Helper data source which can be used to create the configuration for a pipeline processor. This processor executes another pipeline. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/pipeline-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_pipeline - -Executes another pipeline. - -The name of the current pipeline can be accessed from the `_ingest.pipeline` ingest metadata key. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/pipeline-processor.html +# elasticstack_elasticsearch_ingest_processor_pipeline (Data Source) +Helper data source which can be used to create the configuration for a pipeline processor. This processor executes another pipeline. 
See: https://www.elastic.co/guide/en/elasticsearch/reference/current/pipeline-processor.html ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_registered_domain.md b/docs/data-sources/elasticsearch_ingest_processor_registered_domain.md index 215a0be3a..0641c9983 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_registered_domain.md +++ b/docs/data-sources/elasticsearch_ingest_processor_registered_domain.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_registered_domain Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_registered_domain Data Source" description: |- - Helper data source to create a processor which Extracts the registered domain, sub-domain, and top-level domain from a fully qualified domain name. + Helper data source which can be used to create the configuration for a registered domain processor. This processor extracts the registered domain (also known as the effective top-level domain or eTLD), sub-domain, and top-level domain from a fully qualified domain name (FQDN). See: https://www.elastic.co/guide/en/elasticsearch/reference/current/registered-domain-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_registered_domain - -Extracts the registered domain (also known as the effective top-level domain or eTLD), sub-domain, and top-level domain from a fully qualified domain name (FQDN). Uses the registered domains defined in the Mozilla Public Suffix List. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/registered-domain-processor.html +# elasticstack_elasticsearch_ingest_processor_registered_domain (Data Source) +Helper data source which can be used to create the configuration for a registered domain processor. 
This processor extracts the registered domain (also known as the effective top-level domain or eTLD), sub-domain, and top-level domain from a fully qualified domain name (FQDN). See: https://www.elastic.co/guide/en/elasticsearch/reference/current/registered-domain-processor.html ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_remove.md b/docs/data-sources/elasticsearch_ingest_processor_remove.md index 5a5a1984c..d1db4f7e3 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_remove.md +++ b/docs/data-sources/elasticsearch_ingest_processor_remove.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_remove Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_remove Data Source" description: |- - Helper data source to create a processor which removes existing fields. + Helper data source which can be used to create the configuration for a remove processor. This processor removes existing fields. See the remove processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/remove-processor.html for more details. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_remove - -Removes existing fields. If one field doesn’t exist, an exception will be thrown. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/remove-processor.html +# elasticstack_elasticsearch_ingest_processor_remove (Data Source) +Helper data source which can be used to create the configuration for a remove processor. This processor removes existing fields. See the [remove processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/remove-processor.html) for more details. 
## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_rename.md b/docs/data-sources/elasticsearch_ingest_processor_rename.md index f1268b4f5..57effd437 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_rename.md +++ b/docs/data-sources/elasticsearch_ingest_processor_rename.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_rename Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_rename Data Source" description: |- - Helper data source to create a processor which renames an existing field. + Helper data source which can be used to create the configuration for a rename processor. This processor renames an existing field. See the rename processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/rename-processor.html for more details. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_rename - -Renames an existing field. If the field doesn’t exist or the new name is already used, an exception will be thrown. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/rename-processor.html +# elasticstack_elasticsearch_ingest_processor_rename (Data Source) +Helper data source which can be used to create the configuration for a rename processor. This processor renames an existing field. See the [rename processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/rename-processor.html) for more details. 
## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_reroute.md b/docs/data-sources/elasticsearch_ingest_processor_reroute.md index bcb5a9bb0..32b763744 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_reroute.md +++ b/docs/data-sources/elasticsearch_ingest_processor_reroute.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_reroute Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_reroute Data Source" description: |- - Helper data source to create a processor which reroutes a document to a different data stream, index, or index alias. + Helper data source which can be used to create the configuration for a reroute processor. This processor reroutes a document to a different data stream, index, or index alias. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/reroute-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_reroute - -Reroutes a document to a different data stream, index, or index alias. This processor is useful for routing documents based on data stream routing rules. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/reroute-processor.html +# elasticstack_elasticsearch_ingest_processor_reroute (Data Source) +Helper data source which can be used to create the configuration for a reroute processor. This processor reroutes a document to a different data stream, index, or index alias. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/reroute-processor.html ## Example Usage @@ -52,4 +50,4 @@ resource "elasticstack_elasticsearch_ingest_pipeline" "my_ingest_pipeline" { ### Read-Only - `id` (String) Internal identifier of the resource. -- `json` (String) JSON representation of this data source. 
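Across all of these processor data sources the usage pattern is the same: each data source renders a single processor definition, and its read-only `json` attribute is passed into the `processors` list of an `elasticstack_elasticsearch_ingest_pipeline` resource (as the truncated examples above show). A minimal sketch of that pattern — the `set` processor's `field`/`value` arguments and the specific names used here are illustrative, not taken from these files:

```terraform
# Render a single "set" processor definition as JSON.
data "elasticstack_elasticsearch_ingest_processor_set" "tag" {
  field = "event.tag" # hypothetical target field
  value = "reviewed"
}

# Compose one or more rendered processors into an ingest pipeline.
resource "elasticstack_elasticsearch_ingest_pipeline" "example" {
  name = "example-pipeline"

  processors = [
    data.elasticstack_elasticsearch_ingest_processor_set.tag.json,
  ]
}
```

Because each processor is rendered independently, pipelines can be assembled from any mix of these data sources simply by ordering their `json` outputs in the `processors` list.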
\ No newline at end of file +- `json` (String) JSON representation of this data source. diff --git a/docs/data-sources/elasticsearch_ingest_processor_script.md b/docs/data-sources/elasticsearch_ingest_processor_script.md index b52d3cb41..fda8e951d 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_script.md +++ b/docs/data-sources/elasticsearch_ingest_processor_script.md @@ -1,27 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_script Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_script Data Source" description: |- - Helper data source to create a processor which runs an inline or stored script on incoming documents. + Helper data source which can be used to create the configuration for a script processor. This processor runs an inline or stored script on incoming documents. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/script-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_script - -Runs an inline or stored script on incoming documents. The script runs in the ingest context. - -The script processor uses the script cache to avoid recompiling the script for each incoming document. To improve performance, ensure the script cache is properly sized before using a script processor in production. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/script-processor.html - -### Access source fields - -The script processor parses each incoming document’s JSON source fields into a set of maps, lists, and primitives. To access these fields with a Painless script, use the map access operator: `ctx['my-field']`. You can also use the shorthand `ctx.` syntax. - -### Access metadata fields - -You can also use a script processor to access metadata fields. 
+# elasticstack_elasticsearch_ingest_processor_script (Data Source) +Helper data source which can be used to create the configuration for a script processor. This processor runs an inline or stored script on incoming documents. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/script-processor.html ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_set.md b/docs/data-sources/elasticsearch_ingest_processor_set.md index 6eeab00e1..bf6b65838 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_set.md +++ b/docs/data-sources/elasticsearch_ingest_processor_set.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_set Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_set Data Source" description: |- - Helper data source to create a processor which sets one field and associates it with the specified value. + Helper data source which can be used to create the configuration for a set processor. This processor sets one field and associates it with the specified value. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/set-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_set - -Sets one field and associates it with the specified value. If the field already exists, its value will be replaced with the provided one. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/set-processor.html +# elasticstack_elasticsearch_ingest_processor_set (Data Source) +Helper data source which can be used to create the configuration for a set processor. This processor sets one field and associates it with the specified value. 
See: https://www.elastic.co/guide/en/elasticsearch/reference/current/set-processor.html ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_set_security_user.md b/docs/data-sources/elasticsearch_ingest_processor_set_security_user.md index 62249a9f0..2586e6470 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_set_security_user.md +++ b/docs/data-sources/elasticsearch_ingest_processor_set_security_user.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_set_security_user Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_set_security_user Data Source" description: |- - Helper data source to create a processor which sets user-related details from the current authenticated user to the current document by pre-processing the ingest. + Helper data source which can be used to create the configuration for a set security user processor. This processor sets user-related details (such as username, roles, email, full_name, metadata, api_key, realm and authentication_type) from the current authenticated user to the current document by pre-processing the ingest. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest-node-set-security-user-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_set_security_user - -Sets user-related details (such as `username`, `roles`, `email`, `full_name`, `metadata`, `api_key`, `realm` and `authentication_typ`e) from the current authenticated user to the current document by pre-processing the ingest. The `api_key` property exists only if the user authenticates with an API key. It is an object containing the id, name and metadata (if it exists and is non-empty) fields of the API key. The realm property is also an object with two fields, name and type. 
When using API key authentication, the realm property refers to the realm from which the API key is created. The `authentication_type property` is a string that can take value from `REALM`, `API_KEY`, `TOKEN` and `ANONYMOUS`. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest-node-set-security-user-processor.html +# elasticstack_elasticsearch_ingest_processor_set_security_user (Data Source) +Helper data source which can be used to create the configuration for a set security user processor. This processor sets user-related details (such as username, roles, email, full_name, metadata, api_key, realm and authentication_type) from the current authenticated user to the current document by pre-processing the ingest. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest-node-set-security-user-processor.html ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_sort.md b/docs/data-sources/elasticsearch_ingest_processor_sort.md index c4c240503..a565104b4 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_sort.md +++ b/docs/data-sources/elasticsearch_ingest_processor_sort.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_sort Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_sort Data Source" description: |- - Helper data source to create a processor which sorts the elements of an array ascending or descending. + Helper data source which can be used to create the configuration for a sort processor. This processor sorts the elements of an array ascending or descending. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/sort-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_sort - -Sorts the elements of an array ascending or descending. 
Homogeneous arrays of numbers will be sorted numerically, while arrays of strings or heterogeneous arrays of strings + numbers will be sorted lexicographically. Throws an error when the field is not an array. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/sort-processor.html +# elasticstack_elasticsearch_ingest_processor_sort (Data Source) +Helper data source which can be used to create the configuration for a sort processor. This processor sorts the elements of an array ascending or descending. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/sort-processor.html ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_split.md b/docs/data-sources/elasticsearch_ingest_processor_split.md index d8f318509..c95f3b878 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_split.md +++ b/docs/data-sources/elasticsearch_ingest_processor_split.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_split Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_split Data Source" description: |- - Helper data source to create a processor which splits a field into an array using a separator character. + Helper data source which can be used to create the configuration for a split processor. This processor splits a field into an array using a separator character. See the split processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/split-processor.html for more details. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_split - -Splits a field into an array using a separator character. Only works on string fields. 
- -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/split-processor.html +# elasticstack_elasticsearch_ingest_processor_split (Data Source) +Helper data source which can be used to create the configuration for a split processor. This processor splits a field into an array using a separator character. See the [split processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/split-processor.html) for more details. ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_trim.md b/docs/data-sources/elasticsearch_ingest_processor_trim.md index 4f230cff9..224fe5b99 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_trim.md +++ b/docs/data-sources/elasticsearch_ingest_processor_trim.md @@ -1,19 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_trim Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_trim Data Source" description: |- - Helper data source to create a processor which trims whitespace from field. + Helper data source which can be used to create the configuration for a trim processor. This processor trims whitespace from a field. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/trim-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_trim - -Trims whitespace from field. If the field is an array of strings, all members of the array will be trimmed. - -**NOTE:** This only works on leading and trailing whitespace. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/trim-processor.html +# elasticstack_elasticsearch_ingest_processor_trim (Data Source) +Helper data source which can be used to create the configuration for a trim processor. This processor trims whitespace from a field. 
See: https://www.elastic.co/guide/en/elasticsearch/reference/current/trim-processor.html ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_uppercase.md b/docs/data-sources/elasticsearch_ingest_processor_uppercase.md index 6954ed14c..21b3c7fb9 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_uppercase.md +++ b/docs/data-sources/elasticsearch_ingest_processor_uppercase.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_uppercase Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_uppercase Data Source" description: |- - Helper data source to create a processor which converts a string to its uppercase equivalent. + Helper data source which can be used to create the configuration for an uppercase processor. This processor converts a string to its uppercase equivalent. If the field is an array of strings, all members of the array will be converted. See the uppercase processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/uppercase-processor.html for more details. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_uppercase - -Converts a string to its uppercase equivalent. If the field is an array of strings, all members of the array will be converted. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/uppercase-processor.html +# elasticstack_elasticsearch_ingest_processor_uppercase (Data Source) +Helper data source which can be used to create the configuration for an uppercase processor. This processor converts a string to its uppercase equivalent. If the field is an array of strings, all members of the array will be converted. 
See the [uppercase processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/uppercase-processor.html) for more details. ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_uri_parts.md b/docs/data-sources/elasticsearch_ingest_processor_uri_parts.md index 5867f8baf..2e6db8a25 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_uri_parts.md +++ b/docs/data-sources/elasticsearch_ingest_processor_uri_parts.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_uri_parts Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_uri_parts Data Source" description: |- - Helper data source to create a processor which parses a Uniform Resource Identifier (URI) string and extracts its components as an object. + Helper data source which can be used to create the configuration for a URI parts processor. This processor parses a Uniform Resource Identifier (URI) string and extracts its components as an object. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/uri-parts-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_uri_parts - -Parses a Uniform Resource Identifier (URI) string and extracts its components as an object. This URI object includes properties for the URI’s domain, path, fragment, port, query, scheme, user info, username, and password. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/uri-parts-processor.html +# elasticstack_elasticsearch_ingest_processor_uri_parts (Data Source) +Helper data source which can be used to create the configuration for a URI parts processor. This processor parses a Uniform Resource Identifier (URI) string and extracts its components as an object. 
See: https://www.elastic.co/guide/en/elasticsearch/reference/current/uri-parts-processor.html ## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_urldecode.md b/docs/data-sources/elasticsearch_ingest_processor_urldecode.md index e8dae0d43..1e32ea90f 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_urldecode.md +++ b/docs/data-sources/elasticsearch_ingest_processor_urldecode.md @@ -1,17 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_urldecode Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_urldecode Data Source" description: |- - Helper data source to create a processor which URL-decodes a string. + Helper data source which can be used to create the configuration for a URL-decode processor. This processor URL-decodes a string. See the URL decode processor documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/urldecode-processor.html for more details. --- -# Data Source: elasticstack_elasticsearch_ingest_processor_urldecode - -URL-decodes a string. If the field is an array of strings, all members of the array will be decoded. - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/urldecode-processor.html +# elasticstack_elasticsearch_ingest_processor_urldecode (Data Source) +Helper data source which can be used to create the configuration for a URL-decode processor. This processor URL-decodes a string. See the [URL decode processor documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/urldecode-processor.html) for more details. 
## Example Usage diff --git a/docs/data-sources/elasticsearch_ingest_processor_user_agent.md b/docs/data-sources/elasticsearch_ingest_processor_user_agent.md index 1c728515b..3d07503f0 100644 --- a/docs/data-sources/elasticsearch_ingest_processor_user_agent.md +++ b/docs/data-sources/elasticsearch_ingest_processor_user_agent.md @@ -1,20 +1,15 @@ + --- +# generated by https://github.com/hashicorp/terraform-plugin-docs +page_title: "elasticstack_elasticsearch_ingest_processor_user_agent Data Source - terraform-provider-elasticstack" subcategory: "Ingest" -layout: "" -page_title: "Elasticstack: elasticstack_elasticsearch_ingest_processor_user_agent Data Source" description: |- - Helper data source to create a processor which extracts details from the user agent string a browser sends with its web requests. + Helper data source which can be used to create the configuration for a user agent processor. This processor extracts details from the user agent string a browser sends with its web requests. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/user-agent-processor.html --- -# Data Source: elasticstack_elasticsearch_ingest_processor_user_agent - -The `user_agent` processor extracts details from the user agent string a browser sends with its web requests. This processor adds this information by default under the `user_agent` field. - -The ingest-user-agent module ships by default with the regexes.yaml made available by uap-java with an Apache 2.0 license. For more details see https://github.com/ua-parser/uap-core. - - -See: https://www.elastic.co/guide/en/elasticsearch/reference/current/user-agent-processor.html +# elasticstack_elasticsearch_ingest_processor_user_agent (Data Source) +Helper data source which can be used to create the configuration for a user agent processor. This processor extracts details from the user agent string a browser sends with its web requests. 
 See: https://www.elastic.co/guide/en/elasticsearch/reference/current/user-agent-processor.html
 
 ## Example Usage
diff --git a/docs/data-sources/elasticsearch_security_role.md b/docs/data-sources/elasticsearch_security_role.md
index f41c364c0..4f50c343b 100644
--- a/docs/data-sources/elasticsearch_security_role.md
+++ b/docs/data-sources/elasticsearch_security_role.md
@@ -1,14 +1,15 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_security_role Data Source - terraform-provider-elasticstack"
 subcategory: "Security"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_security_role Data Source"
 description: |-
-  Retrieves roles in the native realm.
+  Retrieves roles in the native realm. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-get-role.html
 ---
 
-# Data Source: elasticstack_elasticsearch_security_role
+# elasticstack_elasticsearch_security_role (Data Source)
 
-Use this data source to get information about an existing Elasticsearch role. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-get-role.html
+Retrieves roles in the native realm. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-get-role.html
 
 ## Example Usage
diff --git a/docs/data-sources/elasticsearch_security_role_mapping.md b/docs/data-sources/elasticsearch_security_role_mapping.md
index ae93d8aaf..83eef6d35 100644
--- a/docs/data-sources/elasticsearch_security_role_mapping.md
+++ b/docs/data-sources/elasticsearch_security_role_mapping.md
@@ -1,12 +1,13 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_security_role_mapping Data Source - terraform-provider-elasticstack"
 subcategory: "Security"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_security_role_mapping Data Source"
 description: |-
-  Retrieves role mappings.
+  Retrieves role mappings. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-get-role-mapping.html
 ---
 
-# Data Source: elasticstack_elasticsearch_security_role_mapping
+# elasticstack_elasticsearch_security_role_mapping (Data Source)
 
 Retrieves role mappings. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-get-role-mapping.html
@@ -35,7 +36,7 @@ output "user" {
 
 ### Optional
 
-- `elasticsearch_connection` (Block List, Max: 1, Deprecated) Elasticsearch connection configuration block. This property will be removed in a future provider version. Configure the Elasticsearch connection via the provider configuration instead. (see [below for nested schema](#nestedblock--elasticsearch_connection))
+- `elasticsearch_connection` (Block List, Deprecated) Elasticsearch connection configuration block. (see [below for nested schema](#nestedblock--elasticsearch_connection))
 
 ### Read-Only
diff --git a/docs/data-sources/elasticsearch_security_user.md b/docs/data-sources/elasticsearch_security_user.md
index 6fc2c049c..5d180e67a 100644
--- a/docs/data-sources/elasticsearch_security_user.md
+++ b/docs/data-sources/elasticsearch_security_user.md
@@ -1,14 +1,15 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_security_user Data Source - terraform-provider-elasticstack"
 subcategory: "Security"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_security_user Data Source"
 description: |-
-  Gets information about Elasticsearch user.
+  Get the information about the user in the ES cluster. See the security API get user documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-get-user.html for more details.
 ---
 
-# Data Source: elasticstack_elasticsearch_security_user
+# elasticstack_elasticsearch_security_user (Data Source)
 
-Use this data source to get information about existing Elasticsearch user. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-get-user.html".
+Get the information about the user in the ES cluster. See the [security API get user documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-get-user.html) for more details.
 
 ## Example Usage
diff --git a/docs/data-sources/elasticsearch_snapshot_repository.md b/docs/data-sources/elasticsearch_snapshot_repository.md
index 7004d0ef4..9987b6e1e 100644
--- a/docs/data-sources/elasticsearch_snapshot_repository.md
+++ b/docs/data-sources/elasticsearch_snapshot_repository.md
@@ -1,14 +1,15 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_snapshot_repository Data Source - terraform-provider-elasticstack"
 subcategory: "Snapshot"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_snapshot_repository Data Source"
 description: |-
   Gets information about the registered snapshot repositories.
 ---
 
-# Data Source: elasticstack_elasticsearch_snapshot_repository
+# elasticstack_elasticsearch_snapshot_repository (Data Source)
 
-This data source provides the information about the registered snaphosts repositories
+Gets information about the registered snapshot repositories.
 
 ## Example Usage
diff --git a/docs/data-sources/fleet_enrollment_tokens.md b/docs/data-sources/fleet_enrollment_tokens.md
index 215ba3621..bfe6753e9 100644
--- a/docs/data-sources/fleet_enrollment_tokens.md
+++ b/docs/data-sources/fleet_enrollment_tokens.md
@@ -1,14 +1,15 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_fleet_enrollment_tokens Data Source - terraform-provider-elasticstack"
 subcategory: "Fleet"
-layout: ""
-page_title: "Elasticstack: elasticstack_fleet_enrollment_tokens Data Source"
 description: |-
-  Gets information about Fleet Enrollment Tokens. See https://www.elastic.co/guide/en/fleet/current/fleet-enrollment-tokens.html
+  Retrieves Elasticsearch API keys used to enroll Elastic Agents in Fleet. See the Fleet enrollment tokens documentation https://www.elastic.co/guide/en/fleet/current/fleet-enrollment-tokens.html for more details.
 ---
 
-# Data Source: elasticstack_fleet_enrollment_tokens
+# elasticstack_fleet_enrollment_tokens (Data Source)
 
-This data source provides information about Fleet Enrollment Tokens.
+Retrieves Elasticsearch API keys used to enroll Elastic Agents in Fleet. See the [Fleet enrollment tokens documentation](https://www.elastic.co/guide/en/fleet/current/fleet-enrollment-tokens.html) for more details.
 
 ## Example Usage
diff --git a/docs/data-sources/fleet_integration.md b/docs/data-sources/fleet_integration.md
index 4b329781c..9a9493cf1 100644
--- a/docs/data-sources/fleet_integration.md
+++ b/docs/data-sources/fleet_integration.md
@@ -1,12 +1,21 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_fleet_integration Data Source - terraform-provider-elasticstack"
 subcategory: "Fleet"
-layout: ""
-page_title: "Elasticstack: elasticstack_fleet_integration Data Source"
 description: |-
-  Gets information about a Fleet integration package.
+  This data source provides information about a Fleet integration package. Currently,
+  the data source will retrieve the latest available version of the package. Version
+  selection is determined by the Fleet API, which is currently based on semantic
+  versioning.
+  By default, the highest GA release version will be selected. If a
+  package is not GA (the version is below 1.0.0) or if a new non-GA version of the
+  package is to be selected (i.e., the GA version of the package is 1.5.0, but there's
+  a new 1.5.1-beta version available), then the prerelease parameter in the plan
+  should be set to true.
 ---
 
-# Data Source: elasticstack_fleet_integration
+# elasticstack_fleet_integration (Data Source)
 
 This data source provides information about a Fleet integration package. Currently,
 the data source will retrieve the latest available version of the package. Version
diff --git a/docs/data-sources/kibana_action_connector.md b/docs/data-sources/kibana_action_connector.md
index 5513b5d70..ed5b3b505 100644
--- a/docs/data-sources/kibana_action_connector.md
+++ b/docs/data-sources/kibana_action_connector.md
@@ -1,14 +1,15 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_kibana_action_connector Data Source - terraform-provider-elasticstack"
 subcategory: "Kibana"
-layout: ""
-page_title: "Elasticstack: elasticstack_kibana_action_connector Data Source"
 description: |-
-  Retrieve a specific action connector role. See https://www.elastic.co/guide/en/kibana/current/get-all-connectors-api.html.
+  Search for a connector by name, space id, and type. Note, that this data source will fail if more than one connector shares the same name.
 ---
 
-# Data Source: elasticstack_kibana_action_connector
+# elasticstack_kibana_action_connector (Data Source)
 
-Use this data source to get information about an existing action connector.
+Search for a connector by name, space id, and type. Note, that this data source will fail if more than one connector shares the same name.
 
 ## Example Usage
diff --git a/docs/data-sources/kibana_security_role.md b/docs/data-sources/kibana_security_role.md
index f2db9f711..8f145f918 100644
--- a/docs/data-sources/kibana_security_role.md
+++ b/docs/data-sources/kibana_security_role.md
@@ -1,14 +1,15 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_kibana_security_role Data Source - terraform-provider-elasticstack"
 subcategory: "Kibana"
-layout: ""
-page_title: "Elasticstack: elasticstack_kibana_security_role Data Source"
 description: |-
-  Retrieve a specific Kibana role. See https://www.elastic.co/guide/en/kibana/master/role-management-specific-api-get.html
+  Retrieve a specific role. See the role management API documentation https://www.elastic.co/guide/en/kibana/current/role-management-specific-api-get.html for more details.
 ---
 
-# Data Source: elasticstack_kibana_security_role
+# elasticstack_kibana_security_role (Data Source)
 
-Use this data source to get information about an existing Kibana role.
+Retrieve a specific role. See the [role management API documentation](https://www.elastic.co/guide/en/kibana/current/role-management-specific-api-get.html) for more details.
 
 ## Example Usage
diff --git a/docs/data-sources/kibana_spaces.md b/docs/data-sources/kibana_spaces.md
index 4bb1d0120..df1d7fb28 100644
--- a/docs/data-sources/kibana_spaces.md
+++ b/docs/data-sources/kibana_spaces.md
@@ -1,14 +1,15 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_kibana_spaces Data Source - terraform-provider-elasticstack"
 subcategory: "Kibana"
-layout: ""
-page_title: "Elasticstack: elasticstack_kibana_spaces Data Source"
 description: |-
-  Retrieve all Kibana spaces. See https://www.elastic.co/guide/en/kibana/master/spaces-api-get-all.html
+  Use this data source to retrieve and get information about all existing Kibana spaces. See https://www.elastic.co/guide/en/kibana/master/spaces-api-get-all.html
 ---
 
-# Data Source: elasticstack_kibana_spaces
+# elasticstack_kibana_spaces (Data Source)
 
-Use this data source to retrieve and get information about all existing Kibana spaces.
+Use this data source to retrieve and get information about all existing Kibana spaces. See https://www.elastic.co/guide/en/kibana/master/spaces-api-get-all.html
 
 ## Example Usage
@@ -41,11 +42,12 @@ Required:
 Optional:
 
 - `description` (String) The description for the space.
-- `disabled_features` (List of String) The list of disabled features for the space. To get a list of available feature IDs, use the Features API (https://www.elastic.co/guide/en/kibana/master/features-api-get.html).
 - `image_url` (String) The data-URL encoded image to display in the space avatar.
 
 Read-Only:
 
 - `color` (String) The hexadecimal color code used in the space avatar. By default, the color is automatically generated from the space name.
+- `disabled_features` (List of String) The list of disabled features for the space. To get a list of available feature IDs, use the Features API (https://www.elastic.co/guide/en/kibana/master/features-api-get.html).
 - `id` (String) Internal identifier of the resource.
 - `initials` (String) The initials shown in the space avatar. By default, the initials are automatically generated from the space name. Initials must be 1 or 2 characters.
+- `solution` (String) The solution view for the space. Valid options are `security`, `oblt`, `es`, or `classic`.
diff --git a/docs/resources/apm_agent_configuration.md b/docs/resources/apm_agent_configuration.md
index 8df89ceae..df99f512e 100644
--- a/docs/resources/apm_agent_configuration.md
+++ b/docs/resources/apm_agent_configuration.md
@@ -1,14 +1,15 @@
+
 ---
-subcategory: "Kibana"
-layout: ""
-page_title: "Elasticstack: elasticstack_apm_agent_configuration Resource"
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_apm_agent_configuration Resource - terraform-provider-elasticstack"
+subcategory: "APM"
 description: |-
-  Creates or updates an APM agent configuration
+  Creates or updates an APM agent configuration. See https://www.elastic.co/docs/solutions/observability/apm/apm-agent-central-configuration.
 ---
 
-# Resource: elasticstack_apm_agent_configuration
+# elasticstack_apm_agent_configuration (Resource)
 
-Creates or updates an APM agent configuration. See https://www.elastic.co/docs/solutions/observability/apm/apm-agent-central-configuration
+Creates or updates an APM agent configuration. See https://www.elastic.co/docs/solutions/observability/apm/apm-agent-central-configuration.
 
 ## Example Usage
@@ -49,6 +50,8 @@ resource "elasticstack_apm_agent_configuration" "test_config" {
 
 Import is supported using the following syntax:
 
+The [`terraform import` command](https://developer.hashicorp.com/terraform/cli/commands/import) can be used, for example:
+
 ```shell
 terraform import elasticstack_apm_agent_configuration.test_configuration my-service:production
 ```
diff --git a/docs/resources/elasticsearch_cluster_settings.md b/docs/resources/elasticsearch_cluster_settings.md
index bd591363e..8c675c24c 100644
--- a/docs/resources/elasticsearch_cluster_settings.md
+++ b/docs/resources/elasticsearch_cluster_settings.md
@@ -1,14 +1,15 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_cluster_settings Resource - terraform-provider-elasticstack"
 subcategory: "Cluster"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_cluster_settings Resource"
 description: |-
-  Updates cluster-wide settings.
+  Updates cluster-wide settings. If the Elasticsearch security features are enabled, you must have the manage cluster privilege to use this API. See the cluster settings documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/cluster-update-settings.html for more details.
 ---
 
-# Resource: elasticstack_elasticsearch_cluster_settings
+# elasticstack_elasticsearch_cluster_settings (Resource)
 
-Updates cluster-wide settings. If the Elasticsearch security features are enabled, you must have the manage cluster privilege to use this API. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/cluster-update-settings.html
+Updates cluster-wide settings. If the Elasticsearch security features are enabled, you must have the manage cluster privilege to use this API. See the [cluster settings documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/cluster-update-settings.html) for more details.
 
 ## Example Usage
diff --git a/docs/resources/elasticsearch_component_template.md b/docs/resources/elasticsearch_component_template.md
index 441232654..c835fa40d 100644
--- a/docs/resources/elasticsearch_component_template.md
+++ b/docs/resources/elasticsearch_component_template.md
@@ -1,14 +1,15 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_component_template Resource - terraform-provider-elasticstack"
 subcategory: "Index"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_component_template Resource"
 description: |-
-  Creates or updates a component template.
+  Creates or updates a component template. Component templates are building blocks for constructing index templates that specify index mappings, settings, and aliases. See the component template documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-component-template.html for more details.
 ---
 
-# Resource: elasticstack_elasticsearch_component_template
+# elasticstack_elasticsearch_component_template (Resource)
 
-Creates or updates a component template. Component templates are building blocks for constructing index templates that specify index mappings, settings, and aliases. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-component-template.html
+Creates or updates a component template. Component templates are building blocks for constructing index templates that specify index mappings, settings, and aliases. See the [component template documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-component-template.html) for more details.
 
 ## Example Usage
@@ -64,14 +65,14 @@ Optional:
 
 - `alias` (Block Set) Alias to add. (see [below for nested schema](#nestedblock--template--alias))
 - `mappings` (String) Mapping for fields in the index. Should be specified as a JSON object of field mappings. See the documentation (https://www.elastic.co/guide/en/elasticsearch/reference/current/explicit-mapping.html) for more details
-- `settings` (String) Configuration options for the index. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/index-modules.html#index-modules-settings
+- `settings` (String) Configuration options for the index. See the [index modules settings documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/index-modules.html#index-modules-settings) for more details.
 
 
 ### Nested Schema for `template.alias`
 
 Required:
 
-- `name` (String) The alias name. Index alias names support date math. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/date-math-index-names.html
+- `name` (String) The alias name. Index alias names support date math. See the [date math index names documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/date-math-index-names.html) for more details.
 
 Optional:
@@ -108,6 +109,8 @@ Optional:
 
 Import is supported using the following syntax:
 
+The [`terraform import` command](https://developer.hashicorp.com/terraform/cli/commands/import) can be used, for example:
+
 ```shell
 terraform import elasticstack_elasticsearch_component_template.my_template /
 ```
diff --git a/docs/resources/elasticsearch_data_stream.md b/docs/resources/elasticsearch_data_stream.md
index 6cfb9e88d..2402ab784 100644
--- a/docs/resources/elasticsearch_data_stream.md
+++ b/docs/resources/elasticsearch_data_stream.md
@@ -1,14 +1,15 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_data_stream Resource - terraform-provider-elasticstack"
 subcategory: "Index"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_data_stream Resource"
 description: |-
-  Manages Elasticsearch Data Streams
+  Managing Elasticsearch data streams, see: https://www.elastic.co/guide/en/elasticsearch/reference/current/data-stream-apis.html
 ---
 
-# Resource: elasticstack_elasticsearch_data_stream
+# elasticstack_elasticsearch_data_stream (Resource)
 
-Manages data streams. This resource can create, delete and show the information about the created data stream. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/data-stream-apis.html
+Managing Elasticsearch data streams, see: https://www.elastic.co/guide/en/elasticsearch/reference/current/data-stream-apis.html
 
 ## Example Usage
@@ -123,6 +124,8 @@ Read-Only:
 
 Import is supported using the following syntax:
 
+The [`terraform import` command](https://developer.hashicorp.com/terraform/cli/commands/import) can be used, for example:
+
 ```shell
 terraform import elasticstack_elasticsearch_data_stream.my_data_stream /
 ```
diff --git a/docs/resources/elasticsearch_data_stream_lifecycle.md b/docs/resources/elasticsearch_data_stream_lifecycle.md
index 1c4d913d8..cbb14db59 100644
--- a/docs/resources/elasticsearch_data_stream_lifecycle.md
+++ b/docs/resources/elasticsearch_data_stream_lifecycle.md
@@ -1,12 +1,13 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_data_stream_lifecycle Resource - terraform-provider-elasticstack"
 subcategory: "Index"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_data_stream_lifecycle Resource"
 description: |-
-  Manages Lifecycle for Elasticsearch Data Streams
+  Configures the data stream lifecycle for the targeted data streams, see: https://www.elastic.co/guide/en/elasticsearch/reference/current/data-stream-apis.html
 ---
 
-# Resource: elasticstack_elasticsearch_data_stream
+# elasticstack_elasticsearch_data_stream_lifecycle (Resource)
 
 Configures the data stream lifecycle for the targeted data streams, see: https://www.elastic.co/guide/en/elasticsearch/reference/current/data-stream-apis.html
@@ -105,6 +106,8 @@ Optional:
 
 Import is supported using the following syntax:
 
+The [`terraform import` command](https://developer.hashicorp.com/terraform/cli/commands/import) can be used, for example:
+
 ```shell
 terraform import elasticstack_elasticsearch_data_stream_lifecycle.my_data_stream_lifecycle /
 ```
diff --git a/docs/resources/elasticsearch_enrich_policy.md b/docs/resources/elasticsearch_enrich_policy.md
index 0fb926ffa..9e0a7ba52 100644
--- a/docs/resources/elasticsearch_enrich_policy.md
+++ b/docs/resources/elasticsearch_enrich_policy.md
@@ -1,14 +1,15 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_enrich_policy Resource - terraform-provider-elasticstack"
 subcategory: "Enrich"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_enrich_policy"
 description: |-
-  Managing Elasticsearch enrich policies, see: https://www.elastic.co/guide/en/elasticsearch/reference/current/enrich-apis.html
+  Managing Elasticsearch enrich policies. See the enrich API documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/enrich-apis.html for more details.
 ---
 
-# Resource: elasticstack_elasticsearch_enrich_policy
+# elasticstack_elasticsearch_enrich_policy (Resource)
 
-Creates or updates enrich policies, see: https://www.elastic.co/guide/en/elasticsearch/reference/current/enrich-apis.html
+Managing Elasticsearch enrich policies. See the [enrich API documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/enrich-apis.html) for more details.
 
 ## Example Usage
@@ -58,13 +59,13 @@ resource "elasticstack_elasticsearch_enrich_policy" "policy1" {
 
 ### Optional
 
-- `elasticsearch_connection` (Block List, Max: 1, Deprecated) Elasticsearch connection configuration block. This property will be removed in a future provider version. Configure the Elasticsearch connection via the provider configuration instead. (see [below for nested schema](#nestedblock--elasticsearch_connection))
+- `elasticsearch_connection` (Block List, Deprecated) Elasticsearch connection configuration block. (see [below for nested schema](#nestedblock--elasticsearch_connection))
 - `execute` (Boolean) Whether to call the execute API function in order to create the enrich index.
 - `query` (String) Query used to filter documents in the enrich index. The policy only uses documents matching this query to enrich incoming documents. Defaults to a match_all query.
 
 ### Read-Only
 
-- `id` (String) The ID of this resource.
+- `id` (String) Internal identifier of the resource
 
 ### Nested Schema for `elasticsearch_connection`
@@ -90,8 +91,10 @@ Optional:
 
 Import is supported using the following syntax:
 
+The [`terraform import` command](https://developer.hashicorp.com/terraform/cli/commands/import) can be used, for example:
+
 ```shell
 # NOTE: while importing index resource, keep in mind, that some of the default index settings will be imported into the TF state too
 # You can later adjust the index configuration to account for those imported settings
 terraform import elasticstack_elasticsearch_enrich_policy.policy1 /
-```
\ No newline at end of file
+```
diff --git a/docs/resources/elasticsearch_index.md b/docs/resources/elasticsearch_index.md
index da59dd302..15e5f3377 100644
--- a/docs/resources/elasticsearch_index.md
+++ b/docs/resources/elasticsearch_index.md
@@ -1,14 +1,15 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_index Resource - terraform-provider-elasticstack"
 subcategory: "Index"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_index Resource"
 description: |-
-  Creates or updates an index.
+  Creates Elasticsearch indices. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-create-index.html
 ---
 
-# Resource: elasticstack_elasticsearch_index
+# elasticstack_elasticsearch_index (Resource)
 
-Creates or updates an index. This resource can define settings, mappings and aliases. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-create-index.html
+Creates Elasticsearch indices. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-create-index.html
 
 ## Example Usage
@@ -190,13 +191,10 @@ Required:
 
 ## Import
 
-**NOTE:** While importing index resource, keep in mind, that some of the default index settings will be imported into the TF state too.
-You can later adjust the index configuration to account for those imported settings.
-
-Some of the default settings, which could be imported are: `index.number_of_replicas`, `index.number_of_shards` and `index.routing.allocation.include._tier_preference`.
-
 Import is supported using the following syntax:
 
+The [`terraform import` command](https://developer.hashicorp.com/terraform/cli/commands/import) can be used, for example:
+
 ```shell
 # NOTE: while importing index resource, keep in mind, that some of the default index settings will be imported into the TF state too
 # You can later adjust the index configuration to account for those imported settings
diff --git a/docs/resources/elasticsearch_index_lifecycle.md b/docs/resources/elasticsearch_index_lifecycle.md
index 99c241035..efbffa2ff 100644
--- a/docs/resources/elasticsearch_index_lifecycle.md
+++ b/docs/resources/elasticsearch_index_lifecycle.md
@@ -1,12 +1,13 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_index_lifecycle Resource - terraform-provider-elasticstack"
 subcategory: "Index"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_index_lifecycle Resource"
 description: |-
-  Creates or updates lifecycle policy.
+  Creates or updates lifecycle policy. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/ilm-put-lifecycle.html and https://www.elastic.co/guide/en/elasticsearch/reference/current/ilm-index-lifecycle.html
 ---
 
-# Resource: elasticstack_elasticsearch_index_lifecycle
+# elasticstack_elasticsearch_index_lifecycle (Resource)
 
 Creates or updates lifecycle policy. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/ilm-put-lifecycle.html and https://www.elastic.co/guide/en/elasticsearch/reference/current/ilm-index-lifecycle.html
@@ -434,6 +435,8 @@ Optional:
 
 Import is supported using the following syntax:
 
+The [`terraform import` command](https://developer.hashicorp.com/terraform/cli/commands/import) can be used, for example:
+
 ```shell
 terraform import elasticstack_elasticsearch_index_lifecycle.my_ilm /
 ```
diff --git a/docs/resources/elasticsearch_index_template.md b/docs/resources/elasticsearch_index_template.md
index 17cc732de..e6cc51aa6 100644
--- a/docs/resources/elasticsearch_index_template.md
+++ b/docs/resources/elasticsearch_index_template.md
@@ -1,12 +1,13 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_index_template Resource - terraform-provider-elasticstack"
 subcategory: "Index"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_index_template Resource"
 description: |-
-  Creates or updates an index template.
+  Creates or updates an index template. Index templates define settings, mappings, and aliases that can be applied automatically to new indices. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-put-template.html
 ---
 
-# Resource: elasticstack_elasticsearch_index_template
+# elasticstack_elasticsearch_index_template (Resource)
 
 Creates or updates an index template. Index templates define settings, mappings, and aliases that can be applied automatically to new indices. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-put-template.html
@@ -58,6 +59,7 @@ resource "elasticstack_elasticsearch_index_template" "my_data_stream" {
 - `composed_of` (List of String) An ordered list of component template names.
 - `data_stream` (Block List, Max: 1) If this object is included, the template is used to create data streams and their backing indices. Supports an empty object. (see [below for nested schema](#nestedblock--data_stream))
 - `elasticsearch_connection` (Block List, Max: 1, Deprecated) Elasticsearch connection configuration block. This property will be removed in a future provider version. Configure the Elasticsearch connection via the provider configuration instead. (see [below for nested schema](#nestedblock--elasticsearch_connection))
+- `ignore_missing_component_templates` (List of String) A list of component template names that are ignored if missing.
 - `metadata` (String) Optional user metadata about the index template.
 - `priority` (Number) Priority to determine index template precedence when a new data stream or index is created.
 - `template` (Block List, Max: 1) Template to be applied. It may optionally include an aliases, mappings, lifecycle, or settings configuration. (see [below for nested schema](#nestedblock--template))
@@ -135,6 +137,8 @@ Required:
 
 Import is supported using the following syntax:
 
+The [`terraform import` command](https://developer.hashicorp.com/terraform/cli/commands/import) can be used, for example:
+
 ```shell
 terraform import elasticstack_elasticsearch_index_template.my_template /
 ```
diff --git a/docs/resources/elasticsearch_ingest_pipeline.md b/docs/resources/elasticsearch_ingest_pipeline.md
index 53f4c94dd..22d371057 100644
--- a/docs/resources/elasticsearch_ingest_pipeline.md
+++ b/docs/resources/elasticsearch_ingest_pipeline.md
@@ -1,24 +1,24 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_ingest_pipeline Resource - terraform-provider-elasticstack"
 subcategory: "Ingest"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_ingest_pipeline Resource"
 description: |-
-  Manages Ingest Pipelines
+  Manages tasks and resources related to ingest pipelines and processors. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest-apis.html
 ---
 
-# Resource: elasticstack_elasticsearch_ingest_pipeline
+# elasticstack_elasticsearch_ingest_pipeline (Resource)
 
-Use ingest APIs to manage tasks and resources related to ingest pipelines and processors. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest-apis.html
+Manages tasks and resources related to ingest pipelines and processors. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest-apis.html
 
 ## Example Usage
 
-You can provide your custom JSON definitions for the ingest processors:
-
 ```terraform
 provider "elasticstack" {
   elasticsearch {}
 }
 
+// You can provide the ingest pipeline processors as plain JSON objects.
 resource "elasticstack_elasticsearch_ingest_pipeline" "my_ingest_pipeline" {
   name        = "my_ingest_pipeline"
   description = "My first ingest pipeline managed by Terraform"
@@ -43,12 +43,8 @@ EOF
 ,
   ]
 }
-```
-
-Or you can use data sources and Terraform declarative way of defining the ingest processors:
-
-```terraform
+// Or you can use the provided data sources to create the processor data sources.
 
 data "elasticstack_elasticsearch_ingest_processor_set" "set_count" {
   field = "count"
   value = 1
@@ -69,7 +65,6 @@ resource "elasticstack_elasticsearch_ingest_pipeline" "ingest" {
   }
 }
 ```
 
-
 ## Schema
@@ -113,6 +108,8 @@ Optional:
 
 Import is supported using the following syntax:
 
+The [`terraform import` command](https://developer.hashicorp.com/terraform/cli/commands/import) can be used, for example:
+
 ```shell
 terraform import elasticstack_elasticsearch_ingest_pipeline.my_ingest_pipeline /
 ```
diff --git a/docs/resources/elasticsearch_logstash_pipeline.md b/docs/resources/elasticsearch_logstash_pipeline.md
index bf7f49394..e89379a50 100644
--- a/docs/resources/elasticsearch_logstash_pipeline.md
+++ b/docs/resources/elasticsearch_logstash_pipeline.md
@@ -1,14 +1,15 @@
+
 ---
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_logstash_pipeline Resource - terraform-provider-elasticstack"
 subcategory: "Logstash"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_logstash_pipeline Resource"
 description: |-
-  Creates or updates centrally managed logstash pipelines.
+  Manage Logstash Pipelines via Centralized Pipeline Management. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/logstash-apis.html
 ---
 
-# Resource: elasticstack_elasticsearch_logstash_pipeline
+# elasticstack_elasticsearch_logstash_pipeline (Resource)
 
-Creates or updates centrally managed logstash pipelines. See: https://www.elastic.co/guide/en/elasticsearch/reference/current/logstash-apis.html
+Manage Logstash Pipelines via Centralized Pipeline Management. See, https://www.elastic.co/guide/en/elasticsearch/reference/current/logstash-apis.html
 
 ## Example Usage
@@ -112,6 +113,8 @@ Optional:
 
 Import is supported using the following syntax:
 
+The [`terraform import` command](https://developer.hashicorp.com/terraform/cli/commands/import) can be used, for example:
+
 ```shell
 terraform import elasticstack_elasticsearch_logstash_pipeline.example /
 ```
diff --git a/docs/resources/elasticsearch_script.md b/docs/resources/elasticsearch_script.md
index f551360af..9d2920b17 100644
--- a/docs/resources/elasticsearch_script.md
+++ b/docs/resources/elasticsearch_script.md
@@ -1,14 +1,15 @@
+
 ---
-subcategory: "Cluster"
-layout: ""
-page_title: "Elasticstack: elasticstack_elasticsearch_script Resource"
+# generated by https://github.com/hashicorp/terraform-plugin-docs
+page_title: "elasticstack_elasticsearch_script Resource - terraform-provider-elasticstack"
+subcategory: "Elasticsearch"
 description: |-
-  Creates or updates a stored script or search template.
+  Creates or updates a stored script or search template. See the create stored script API documentation https://www.elastic.co/guide/en/elasticsearch/reference/current/create-stored-script-api.html for more details.
 ---
 
-# Resource: elasticstack_elasticsearch_script
+# elasticstack_elasticsearch_script (Resource)
 
-Creates or updates a stored script or search template. See https://www.elastic.co/guide/en/elasticsearch/reference/current/create-stored-script-api.html
+Creates or updates a stored script or search template. See the [create stored script API documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/create-stored-script-api.html) for more details.
 
 ## Example Usage
@@ -54,12 +55,12 @@ resource "elasticstack_elasticsearch_script" "my_search_template" {
 
 ### Optional
 
 - `context` (String) Context in which the script or search template should run.
-- `elasticsearch_connection` (Block List, Max: 1, Deprecated) Elasticsearch connection configuration block. This property will be removed in a future provider version. Configure the Elasticsearch connection via the provider configuration instead. (see [below for nested schema](#nestedblock--elasticsearch_connection))
+- `elasticsearch_connection` (Block List, Deprecated) Elasticsearch connection configuration block. (see [below for nested schema](#nestedblock--elasticsearch_connection))
 - `params` (String) Parameters for the script or search template.
 
 ### Read-Only
 
-- `id` (String) The ID of this resource.
+- `id` (String) Internal identifier of the resource
 
 ### Nested Schema for `elasticsearch_connection`
@@ -85,6 +86,8 @@ Optional:
 
 Import is supported using the following syntax:
 
+The [`terraform import` command](https://developer.hashicorp.com/terraform/cli/commands/import) can be used, for example:
+
 ```shell
 terraform import elasticstack_elasticsearch_script.my_script /