diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS index 415078500d6e2eb..c2b78cb75a120c0 100644 --- a/.github/CODEOWNERS +++ b/.github/CODEOWNERS @@ -90,6 +90,7 @@ /src/content/docs/registrar/ @dcpena @cloudflare/pcx-technical-writing /src/content/docs/rules/ @pedrosousa @cloudflare/pcx-technical-writing /src/content/docs/ruleset-engine/ @pedrosousa @cloudflare/pcx-technical-writing +/src/content/docs/log-explorer/ @angelampcosta @cloudflare/pcx-technical-writing # Developer Platform diff --git a/public/__redirects b/public/__redirects index 765ac0a71a3d472..868a56ebbcd57e3 100644 --- a/public/__redirects +++ b/public/__redirects @@ -188,6 +188,7 @@ /analytics/graphql-api/tutorials/build-your-own-analytics/ /analytics/graphql-api/tutorials/ 301 /analytics/graphql-api/tutorials/export-graphql-to-csv/ /analytics/graphql-api/tutorials/ 301 /analytics/analytics-integrations/google-cloud/ /analytics/analytics-integrations/ 301 +/analytics/dashboards/ /log-explorer/custom-dashboards/ 301 # email-security /email-security/reporting/search/detection-search/ /email-security/reporting/search/ 301 @@ -933,6 +934,7 @@ /logs/reference/logpush-api-configuration/filters/ /logs/reference/filters/ 301 # Non-slashed version is being used in the Cloudflare dashboard /logs/reference/logpush-api-configuration/examples/example-logpush-curl/ /logs/tutorials/examples/example-logpush-curl/ 301 +/logs/log-explorer/ /log-explorer/log-search/ 301 # magic-firewall /magic-firewall/reference/examples/ /magic-firewall/how-to/add-rules/ 301 diff --git a/src/assets/images/log-explorer/supported-sql-grammar-graph.png b/src/assets/images/log-explorer/supported-sql-grammar-graph.png new file mode 100644 index 000000000000000..490442533dd89ae Binary files /dev/null and b/src/assets/images/log-explorer/supported-sql-grammar-graph.png differ diff --git a/src/content/docs/log-explorer/api.mdx b/src/content/docs/log-explorer/api.mdx new file mode 100644 index 000000000000000..510afb806c33743 --- 
/dev/null +++ b/src/content/docs/log-explorer/api.mdx @@ -0,0 +1,82 @@ +--- +pcx_content_type: reference +title: Log Explorer API +sidebar: + order: 5 +--- + +Log Explorer configuration and log searches are also available via a public API. + +## Authentication + +Log Explorer is available to users with the following permissions: + +- **Logs Edit**: users with Logs Edit permissions can enable datasets. +- **Logs Read**: users with Logs Read permissions can run queries via the UI or API. + +Note that these permissions exist at the account and zone level, and you need the appropriate permission level for the datasets you wish to query. + +Authentication with the API can be done via an API token or API key with an email. Refer to [Create API token](/fundamentals/api/get-started/create-token/) for further instructions. + +## Query data + +Log Explorer includes a SQL API for submitting queries. + +For example, to find an HTTP request with a specific [Ray ID](/fundamentals/reference/cloudflare-ray-id/), use the following SQL query: + +```bash +curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/query/sql \ +--header "Authorization: Bearer <API_TOKEN>" \ +--url-query query="SELECT clientRequestScheme, clientRequestHost, clientRequestMethod, edgeResponseStatus, clientRequestUserAgent FROM http_requests WHERE RayID = '806c30a3cec56817' LIMIT 1" +``` + +This command returns the following HTTP request details: + +```json +{ + "result": [ + { + "clientrequestscheme": "https", + "clientrequesthost": "example.com", + "clientrequestmethod": "GET", + "clientrequestuseragent": "curl/7.88.1", + "edgeresponsestatus": 200 + } + ], + "success": true, + "errors": [], + "messages": [] +} +``` + +As another example, you could find Cloudflare Access requests with selected columns from a specific timeframe by performing the following SQL query: + +```bash +curl https://api.cloudflare.com/client/v4/accounts/{account_id}/logs/explorer/query/sql \ +--header "Authorization: Bearer <API_TOKEN>" \ +--url-query
query="SELECT CreatedAt, AppDomain, AppUUID, Action, Allowed, Country, RayID, Email, IPAddress, UserUID FROM access_requests WHERE Date >= '2025-02-06' AND Date <= '2025-02-06' AND CreatedAt >= '2025-02-06T12:28:39Z' AND CreatedAt <= '2025-02-06T12:58:39Z'" +``` + +This command returns the following request details: + +```json +{ + "result": [ + { + "createdat": "2025-01-14T18:17:55Z", + "appdomain": "example.com", + "appuuid": "a66b4ab0-ccdf-4d60-a6d0-54a59a827d92", + "action": "login", + "allowed": true, + "country": "us", + "rayid": "90fbb07c0b316957", + "email": "user@example.com", + "ipaddress": "1.2.3.4", + "useruid": "52859e81-711e-4de0-8b31-283336060e79" + } + ], + "success": true, + "errors": [], + "messages": [] +} +``` \ No newline at end of file diff --git a/src/content/docs/log-explorer/custom-dashboards.mdx b/src/content/docs/log-explorer/custom-dashboards.mdx new file mode 100644 index 000000000000000..9615305b66bc080 --- /dev/null +++ b/src/content/docs/log-explorer/custom-dashboards.mdx @@ -0,0 +1,103 @@ +--- +pcx_content_type: reference +title: Custom dashboards +sidebar: + order: 3 +--- + +Custom dashboards allow you to create tailored dashboards to monitor application security, performance, and usage. You can create monitors for ongoing monitoring of a previous incident, use them to identify indicators of suspicious activity, and access templates to help you get started. + +:::note +Enterprise customers can create up to 100 dashboards. + +Customers on Pro and Business plans can create up to 5 dashboards. +::: + +Dashboards provide a visual interface that displays key metrics and analytics, helping you monitor and analyze data efficiently. Different dashboards serve different purposes. For example, a security dashboard tracks attack attempts and threats, a performance dashboard monitors API latency and uptime, and a usage dashboard analyzes traffic patterns and user behavior. 
+ +Different metrics serve distinct roles in providing insights into your application's performance. For example, total HTTP requests offer an overview of traffic volume, while average response time helps assess application speed. Additionally, usage metrics such as traffic patterns and user behavior provide insight into how users interact with your application. These metrics together enable you to spot trends, identify problems, and make informed, data-driven decisions. + +## Create a new dashboard + +To create a new dashboard: + +1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com/login) and select your account. +2. Go to **Log Explorer** > **Dashboards**. + +When creating a dashboard, you have two options: building one from scratch or using a pre-designed template. + +- **Templates** provide a faster way to set up a dashboard with commonly used metrics and charts. They are useful for standard use cases, such as monitoring security threats, API performance, or bot traffic. Templates help you get started quickly while still allowing modifications to fit your requirements. + +- On the other hand, a **from-scratch dashboard** gives you full control over its structure, allowing you to choose the exact datasets, metrics, and visualizations that fit your needs. This approach is ideal if you have specific monitoring goals or need a highly customized view of your data. + +Choosing between these options depends on whether you need a quick setup with predefined insights or a fully customized dashboard tailored to your unique analysis needs. + +### Create a dashboard from scratch + +When creating a dashboard from scratch, select the option **Create new**. You can follow the instructions in the following sections to start adding charts to your dashboard. + +#### Create a new chart + +To create a new chart, select **Add chart**.
There are two ways to create a chart: + +- **Use a prompt**: Enter a query like `Compare status code ranges over time.` The AI model decides the most appropriate visualization and constructs your chart configuration. +- **Customize your chart**: Select the chart elements manually, including the chart type, title, dataset to query, metrics, and filters. This option gives you full control over your chart's structure. + +Refer to the following sections for more information about the charts, datasets, fields, metrics, and filters available. + +##### Chart types + +The available chart types include: + +- **Timeseries**: Displays trends over time, enabling comparisons across multiple series. +- **Categorical**: Compares proportions across different series. +- **Stat**: Highlights a single value, showing its delta and sparkline for quick insights. +- **Percentage**: Represents one value as a percentage of another. +- **Top N**: Identifies the highest-ranking values for a given attribute. + +##### Datasets and metrics + +The available metrics and filters vary based on the dataset you want to use. For example, when using the HTTP Requests dataset, you can select **origin response duration** as a metric. You can then choose your preferred aggregation method for that metric, such as total, median, or quantiles. The following table outlines the datasets, fields, and available metrics: + + +| Dataset | Field | Definition | Metrics | +|-----------------|-----------------|------------|---------| +| HTTP Requests | Requests | The number of requests sent by a client to a server over the HTTP protocol. | Total | +| | DNS Response Time | The time taken for a DNS query to be resolved, measured from when a request is made to when a response is received. | Total, Average, Median, 95th percentile, 99th percentile | +| | Time to First Byte | The duration from when a request is made to when the first byte of the response is received from the server. 
| Total, Average, Median, 95th percentile, 99th percentile | +| | Bytes returned to the Client | The amount of data (in bytes) sent from the server to the client in response to requests. | Total, Average, Median, 95th percentile, 99th percentile | +| | Number of visits | Unique visits or sessions to a website or application. | Total | +| | Origin response duration | The time taken by the origin server to process and respond to a request. | Total, Average, Median, 95th percentile, 99th percentile | +| Security Events | Security events | Actions taken by Application Security products such as WAF and Bot Management. | Total | + +##### Filters + +You can also adjust the scope of your analytics by entering filter conditions. This allows you to focus on the most relevant data. + +1. Select **Add filter**. +2. Select a **field**, an **operator**, and a **value**. For example, to filter events by source IP address, select the _Source IP_ field, select the _equals_ operator, and enter the IP address. +3. Select **Apply**. + +### Create a dashboard from a template + +Alternatively, you can choose to create your dashboard using a pre-designed dashboard template. The templates available are: + +- **Bot monitoring**: Allows you to identify automated traffic accessing your website. +- **API Security**: Allows you to monitor data transfers and exceptions for API endpoints in your application. +- **Account takeover**: Allows you to monitor login attempts, usage of leaked credentials, and account takeover attacks. +- **API Performance**: Allows you to view timing data for API endpoints in your application, along with error rates. +- **Performance monitoring**: Allows you to identify slow hosts and paths on your origin server, and view time to first byte metrics over time. + +Choose one of the templates and select **Use template**. 
+ +## Edit a dashboard or chart + +After creating your dashboard, select **Back to all dashboards** to view the full list of your saved dashboards. Regardless of how you created your dashboard, you can always edit existing charts and add new ones as needed. + +## Further analysis + +For each chart, you can: + +- Review related traffic in [Security Analytics](/waf/analytics/security-analytics/). +- Explore detailed logs in [Log Explorer](/log-explorer/log-search/). + +This ensures deeper insights into your application's security, performance, and usage patterns. \ No newline at end of file diff --git a/src/content/docs/log-explorer/index.mdx b/src/content/docs/log-explorer/index.mdx new file mode 100644 index 000000000000000..8ad7a9ad56ec279 --- /dev/null +++ b/src/content/docs/log-explorer/index.mdx @@ -0,0 +1,49 @@ +--- +title: Log Explorer +pcx_content_type: overview +sidebar: + order: 1 +--- + +import { Description, Feature, RelatedProduct } from "~/components" + + +Store and explore your Cloudflare logs directly within the Cloudflare dashboard or API. + + +Log Explorer is Cloudflare's native observability and forensics product that enables security teams and developers to analyze, investigate, and monitor issues directly from the Cloudflare dashboard, without the expense and complexity of forwarding logs to third-party tools. + +Log Explorer provides access to Cloudflare logs with all the context available within the Cloudflare platform. You can monitor security and performance issues with custom dashboards or investigate and troubleshoot issues with log search. Benefits include: + +- **Reduced cost and complexity**: Drastically reduce the expense and operational overhead associated with forwarding, storing, and analyzing terabytes of log data in external tools. +- **Faster detection and triage**: Access Cloudflare-native logs directly, eliminating cumbersome data pipelines and the ingest lags that delay critical security insights.
+ +- **Accelerated investigations with full context**: Investigate incidents with Cloudflare's unparalleled contextual data, accelerating your analysis and understanding of "What exactly happened?" and "How did it happen?" +- **Minimal recovery time**: Seamlessly transition from investigation to action with direct mitigation capabilities via the Cloudflare platform. + +## Features + + +Explore your Cloudflare logs directly within the Cloudflare dashboard or [API](/log-explorer/api/). + + + +Design customized views for tracking application security, performance, and usage metrics. + + + +Manage the data you want to store within Log Explorer. + + + +Manage configuration and perform queries via the API. + + +## Related products + + +Forward Cloudflare logs to third-party tools for debugging, identifying configuration adjustments, and creating analytics dashboards. + + + +Visualize the metadata collected by our products in the Cloudflare dashboard. + \ No newline at end of file diff --git a/src/content/docs/log-explorer/log-search.mdx b/src/content/docs/log-explorer/log-search.mdx new file mode 100644 index 000000000000000..ab619233d3516a6 --- /dev/null +++ b/src/content/docs/log-explorer/log-search.mdx @@ -0,0 +1,167 @@ +--- +pcx_content_type: concept +title: Log Search +sidebar: + order: 2 +--- + +import { TabItem, Tabs, Render } from "~/components"; + +Log Explorer enables you to store and explore your Cloudflare logs directly within the Cloudflare dashboard or API, giving you visibility into your logs without the need to forward them to third-party services. Logs are stored on Cloudflare's global network using the R2 object storage platform and can be queried via the dashboard or SQL API. + +## SQL queries supported + +The diagram below displays the supported SQL grammar for `SELECT` statements as a railroad syntax diagram: + +![Supported SQL grammar](~/assets/images/log-explorer/supported-sql-grammar-graph.png) + +Any path from left to right forms a valid query.
There is a limit of 25 predicates in the `WHERE` clause. Predicates can be grouped using parentheses. If the `LIMIT` clause is not specified, the default limit of 10,000 is applied. The maximum number for the `LIMIT` clause is 10,000. Results are returned in descending order by time. + +Examples of queries include: + +- `SELECT * FROM table WHERE (a = 1 OR b = "hello") AND c < 25.89` +- `SELECT a, b, c FROM table WHERE d >= "GB" LIMIT 10` + +### SELECT + +The `SELECT` clause specifies the columns that you want to retrieve from the database tables. It can include individual column names, expressions, or even wildcard characters to select all columns. + +### FROM + +The `FROM` clause specifies the tables from which to retrieve data. It indicates the source of the data for the `SELECT` statement. + +### WHERE + +The `WHERE` clause filters the rows returned by a query based on specified conditions. It allows you to specify conditions that must be met for a row to be included in the result set. + +### GROUP BY + +The `GROUP BY` clause is used to group rows that have the same values into summary rows. + +### ORDER BY + +The `ORDER BY` clause is used to sort the result set by one or more columns in ascending or descending order. + +### LIMIT + +The `LIMIT` clause is used to constrain the number of rows returned by a query. It is often used in conjunction with the `ORDER BY` clause to retrieve the top `N` rows or to implement pagination. + +:::note + +Log Explorer does not support `JOIN`, `DDL`, `DML`, or `EXPLAIN` queries. + +::: + +## Use Log Explorer + +You can filter and view your logs via the Cloudflare dashboard or the API. + +1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com/login) and select your account. +2. Go to **Log Explorer** > **Log Search**. +3. Select the **Dataset** you want to use and in **Columns** select the dataset fields. If you selected a zone-scoped dataset, select the zone you would like to use. +4. Enter a **Limit**.
A limit is the maximum number of results to return, for example, 50. +5. Select the **Time period** from which you want to query, for example, the previous 12 hours. +6. Select **Add filter** to create your query. Select a **Field**, an **Operator**, and a **Value**, then select **Apply**. +7. A query preview is displayed. Select **Custom SQL** to change the query. +8. Select **Run query** when you are done. The results are displayed below within the **Query results** section. + +For example, to find an HTTP request with a specific [Ray ID](/fundamentals/reference/cloudflare-ray-id/), go to **Custom SQL**, and enter the following SQL query: + +```sql +SELECT + clientRequestScheme, + clientRequestHost, + clientRequestMethod, + edgeResponseStatus, + clientRequestUserAgent +FROM http_requests +WHERE RayID = '806c30a3cec56817' +LIMIT 1 +``` + + +As another example, to find Cloudflare Access requests with selected columns from a specific timeframe you could perform the following SQL query: + +```sql +SELECT + CreatedAt, + AppDomain, + AppUUID, + Action, + Allowed, + Country, + RayID, + Email, + IPAddress, + UserUID +FROM access_requests +WHERE Date >= '2025-02-06' AND Date <= '2025-02-06' AND CreatedAt >= '2025-02-06T12:28:39Z' AND CreatedAt <= '2025-02-06T12:58:39Z' +``` + +### Save queries + +After selecting all the fields for your query, you can save it by selecting **Save query**. Provide a name and description to help identify it later. To view your saved and recent queries, select **Queries** — they will appear in a side panel where you can insert a new query, or delete any query. + +## Integration with Security Analytics + +You can also access the Log Explorer dashboard directly from the [Security Analytics dashboard](/waf/analytics/security-analytics/#logs). When doing so, the filters you applied in Security Analytics will automatically carry over to your query in Log Explorer. 
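The clause reference above describes `GROUP BY`, but the examples on this page only filter individual rows. As an illustrative sketch, grouping can be combined with `COUNT(*)` (the same aggregate used in the optimization tips on this page) to summarize traffic per response status for a single day. The `edgeresponsestatus` and `date` columns follow the other examples here; support for aggregate functions beyond `COUNT(*)` is an assumption you should verify against your dataset:

```sql
-- Hypothetical aggregation sketch: count HTTP requests per response status
-- for one day. Column names follow the http_requests examples on this page.
SELECT edgeresponsestatus, COUNT(*)
FROM http_requests
WHERE date = '2025-02-06'
GROUP BY edgeresponsestatus
LIMIT 100
```

Because a query without an explicit `LIMIT` defaults to 10,000 rows, keeping a small `LIMIT` on aggregate queries is a cheap safeguard.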
+ +## Optimize your queries + +All the tables supported by Log Explorer contain a special column called `date`, which helps to narrow down the amount of data that is scanned to respond to your query, resulting in faster query response times. The value of `date` must be in the form of `YYYY-MM-DD`. For example, to query logs that occurred on October 12, 2023, add the following to your `WHERE` clause: `date = '2023-10-12'`. The column supports the standard operators of `<`, `>`, and `=`. + +1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com/login) and select your account. +2. Go to **Log Explorer** > **Log Search** > **Custom SQL**. +3. Enter the following SQL query: + +```sql +SELECT + clientip, + clientrequesthost, + clientrequestmethod, + clientrequesturi, + edgeendtimestamp, + edgeresponsestatus, + originresponsestatus, + edgestarttimestamp, + rayid, + clientcountry, + clientrequestpath, + date +FROM + http_requests +WHERE + date = '2023-10-12' LIMIT 500 +``` + +### Additional query optimization tips + +- Narrow your query time frame. Focus on a smaller time window to reduce the volume of data processed. This helps avoid querying excessive amounts of data and speeds up response times. +- Omit `ORDER BY` and `LIMIT` clauses. These clauses can slow down queries, especially when dealing with large datasets. For queries that return a large number of records, reduce the time frame instead of limiting to the newest `N` records from a broader time frame. +- Select only necessary columns. For example, replace `SELECT *` with the list of specific columns you need. You can also use `SELECT RayId` as a first iteration and follow up with a query that filters by the Ray IDs to retrieve additional columns. Additionally, you can use `SELECT COUNT(*)` to probe for time frames with matching records without retrieving the full dataset. + +## FAQs + +### Which fields (or columns) are available for querying? 
+ +All fields listed in [Log Fields](/logs/reference/log-fields/) for the [supported datasets](/log-explorer/manage-datasets/#supported-datasets) are viewable in Log Explorer. For filtering, only fields with simple values, such as those of type `bool`, `int`, `float`, or `string`, are supported. Fields with key-value pairs are currently not supported. For example, you cannot use the fields `RequestHeaders` and `Cookies` from the HTTP requests dataset in a filter. + +### Why does my query not complete or time out? + +Log Explorer performs best when query parameters focus on narrower ranges of time. You may experience query timeouts when your query would return a large quantity of data. Consider refining your query to improve performance. + +### Why don't I see any logs in my queries after enabling the dataset? + +Log Explorer starts ingesting logs from the moment you enable the dataset. It will not display logs for events that occurred before the dataset was enabled. Make sure that new events have been generated since enabling the dataset, and check again. + +### My query returned an error. How do I figure out what went wrong? + +We are actively working on improving error codes. If you receive a generic error, check your SQL syntax (if you are using the custom SQL feature), make sure you have included a date and a limit, and that the field you are filtering is not a key-value pair. If the query still fails, it is likely timing out. Try refining your filters. + +### Where is the data stored? + +The data is stored in Cloudflare R2. Each Log Explorer dataset is stored on a per-customer level, similar to Cloudflare D1, ensuring that your data is kept separate from that of other customers. In the future, this single-tenant storage model will provide you with the flexibility to create your own retention policies and decide in which regions you want to store your data. + +### Does Log Explorer support Customer Metadata Boundary?
+ +Customer Metadata Boundary is currently not supported for Log Explorer. \ No newline at end of file diff --git a/src/content/docs/log-explorer/manage-datasets.mdx b/src/content/docs/log-explorer/manage-datasets.mdx new file mode 100644 index 000000000000000..7e39410db9f5e78 --- /dev/null +++ b/src/content/docs/log-explorer/manage-datasets.mdx @@ -0,0 +1,69 @@ +--- +pcx_content_type: reference +title: Manage datasets +sidebar: + order: 4 +--- + +import { TabItem, Tabs, Render } from "~/components"; + +Log Explorer allows you to enable or disable the datasets that are available to query in Log Search. + +## Supported datasets + +Log Explorer currently supports the following datasets: + +- [HTTP requests](/logs/reference/log-fields/zone/http_requests/) (`FROM http_requests`) +- [Firewall events](/logs/reference/log-fields/zone/firewall_events/) (`FROM firewall_events`) + +## Enable Log Explorer + +In order for Log Explorer to begin storing logs, you need to enable the desired datasets. You can do this via the dashboard or the API. + +1. Log in to the [Cloudflare dashboard](https://dash.cloudflare.com/login) and select your account. +2. Go to **Log Explorer** > **Manage datasets**. +3. Select **Add dataset** to select the datasets you want to query. +4. Choose a dataset and then a zone. Then, select **Add**. You can always return to this page to enable more datasets or manage your existing ones. + +:::note +It may take a few minutes for the logs to become available for querying. +::: + +If you are using the API, use the Log Explorer API to enable each dataset you wish to store. It may take a few minutes after a log stream is enabled before you can view the logs. + +The following `curl` command is an example for enabling the zone-level dataset `http_requests`, as well as the expected response when the command succeeds.
+ +```bash +curl https://api.cloudflare.com/client/v4/zones/{zone_id}/logs/explorer/datasets \ +--header "Authorization: Bearer <API_TOKEN>" \ +--json '{ + "dataset": "http_requests" +}' +``` + +```json output +{ + "result": { + "dataset": "http_requests", + "object_type": "zone", + "object_id": "", + "created_at": "2025-06-03T14:33:16Z", + "updated_at": "2025-06-03T14:33:16Z", + "dataset_id": "01973635f7e273a1964a02f4d4502499", + "enabled": true + }, + "success": true, + "errors": [], + "messages": [] +} +``` + +To enable an account-level dataset, replace `zones/{zone_id}` with `accounts/{account_id}` in the `curl` command. For example: + +```bash +curl https://api.cloudflare.com/client/v4/accounts/{account_id}/logs/explorer/datasets \ +--header "Authorization: Bearer <API_TOKEN>" \ +--json '{ + "dataset": "access_requests" +}' +``` \ No newline at end of file diff --git a/src/content/docs/logs/log-explorer.mdx b/src/content/docs/logs/log-explorer.mdx index 0c96942fc683e15..381134f43d498f6 100644 --- a/src/content/docs/logs/log-explorer.mdx +++ b/src/content/docs/logs/log-explorer.mdx @@ -207,7 +207,7 @@ You can choose the output format with an HTTP `Accept` header, as shown in the t | CSV | `text/csv` | Yes | | Plain text | `text/plain` | Yes | -## Optimizing your queries +## Optimize your queries All the tables supported by Log Explorer contain a special column called `date`, which helps to narrow down the amount of data that is scanned to respond to your query, resulting in faster query response times. The value of `date` must be in the form of `YYYY-MM-DD`. For example, to query logs that occurred on October 12, 2023, add the following to your `WHERE` clause: `date = '2023-10-12'`. The column supports the standard operators of `<`, `>`, and `=`.
diff --git a/src/content/products/log-explorer.yaml b/src/content/products/log-explorer.yaml new file mode 100644 index 000000000000000..209d009f72d513f --- /dev/null +++ b/src/content/products/log-explorer.yaml @@ -0,0 +1,12 @@ + +name: Log Explorer + +product: + title: Log Explorer + url: /log-explorer/ + group: Core platform + +meta: + title: Cloudflare Log Explorer docs + description: Store and explore your Cloudflare logs directly within the Cloudflare dashboard or API. + author: '@cloudflare' \ No newline at end of file diff --git a/src/icons/log-explorer.svg b/src/icons/log-explorer.svg new file mode 100644 index 000000000000000..60cbf6cacbafd1a --- /dev/null +++ b/src/icons/log-explorer.svg @@ -0,0 +1,4 @@ + + + + \ No newline at end of file