
Commit c5ce09b

Adjusting documentation to work well with new MkDocs

1 parent eb14593 · commit c5ce09b

11 files changed: 83 additions & 106 deletions

docs/index.md

Lines changed: 6 additions & 28 deletions

@@ -1,39 +1,17 @@
 ![version](https://img.shields.io/badge/version-v3.1.13.4036--develop-blue.svg)
 
-# Introduction to utPLSQL
+## What is utPLSQL
 
 utPLSQL is a Unit Testing framework for Oracle PL/SQL.
 The framework follows industry standards and best patterns of modern Unit Testing frameworks like [JUnit](http://junit.org/junit4/) and [RSpec](http://rspec.info/)
-
-- User Guide
-  - [Installation](userguide/install.md)
-  - [Getting Started](userguide/getting-started.md)
-  - [Annotations](userguide/annotations.md)
-  - [Expectations](userguide/expectations.md)
-  - [Advanced data comparison](userguide/advanced_data_comparison.md)
-  - [Running unit tests](userguide/running-unit-tests.md)
-  - [Querying for test suites](userguide/querying_suites.md)
-  - [Testing best practices](userguide/best-practices.md)
-  - [Upgrade utPLSQL](userguide/upgrade.md)
-- Reporting
-  - [Using reporters](userguide/reporters.md)
-  - [Reporting errors](userguide/exception-reporting.md)
-  - [Code coverage](userguide/coverage.md)
-  - [Cheat-sheet](https://www.cheatography.com/jgebal/cheat-sheets/utplsql-v3-1-2/#downloads)
-- About
-  - [Project Details](about/project-details.md)
-  - [License](about/license.md)
-  - [Support](about/support.md)
-  - [Authors](about/authors.md)
-- [Version 2 to Version 3 Comparison](compare_version2_to_3.md)
-
-# Demo project
+
+## Demo project
 
 Have a look at our [demo project](https://github.com/utPLSQL/utPLSQL-demo-project/).
 
 It uses [Travis CI](https://travis-ci.org/utPLSQL/utPLSQL-demo-project) to build on every commit, run all tests, and publish test results and code coverage to [SonarCloud](https://sonarcloud.io/project/overview?id=utPLSQL:utPLSQL-demo-project).
 
-# Three steps
+## Three steps
 
 With just three simple steps you can define and run your unit tests for PL/SQL code.
 
@@ -48,7 +26,7 @@ Here is how you can simply create tested code, unit tests and execute the tests
 Check out the sections on [annotations](userguide/annotations.md) and [expectations](userguide/expectations.md) to see how to define your tests.
 
 
-# Command line
+## Command line
 
 You can use the utPLSQL command line client [utPLSQL-cli](https://github.com/utPLSQL/utPLSQL-cli) to run tests without the need for an Oracle Client or any IDE like SQL Developer/TOAD.
 
@@ -60,7 +38,7 @@ Amongst many benefits they provide ability to:
 Download the [latest client](https://github.com/utPLSQL/utPLSQL-cli/releases/latest) and you are good to go.
 See the [project readme](https://github.com/utPLSQL/utPLSQL-cli/blob/develop/README.md) for details.
 
-# Coverage
+## Coverage
 
 If you want code coverage gathered on your code, it's best to use `ut_run` to execute your tests with multiple reporters, and have both the test execution report and the coverage report saved to a file.

docs/userguide/annotations.md

Lines changed: 1 addition & 3 deletions

@@ -1,7 +1,5 @@
 ![version](https://img.shields.io/badge/version-v3.1.13.4036--develop-blue.svg)
 
-# Annotations
-
 Annotations are used to configure tests and suites in a declarative way, similar to modern OOP languages. This way, test configuration is stored along with the test logic inside the test package.
 No additional configuration files or tables are needed for test cases. The annotation names are based on popular testing frameworks such as JUnit.
 The framework runner searches for all the suitably annotated packages, automatically configures suites, forms the suite hierarchy, executes it and reports results in specified formats.
@@ -38,7 +36,7 @@ There **can not** be any empty lines or comments between annotation line and pro
 There can be many annotations for a procedure.
 
 Valid procedure annotations example:
-```sql
+```sql linenums="1"
 package test_package is
   --%suite
 
docs/userguide/best-practices.md

Lines changed: 0 additions & 2 deletions

@@ -1,7 +1,5 @@
 ![version](https://img.shields.io/badge/version-v3.1.13.4036--develop-blue.svg)
 
-# Best Practices
-
 The following are best practices we at utPLSQL have learned about PL/SQL and Unit Testing.
 
 ## Test Isolation and Dependency

docs/userguide/coverage.md

Lines changed: 6 additions & 1 deletion

@@ -1,8 +1,8 @@
 ![version](https://img.shields.io/badge/version-v3.1.13.4036--develop-blue.svg)
 
-# Coverage
 utPLSQL comes with a built-in coverage reporting engine. The code coverage reporting uses the package DBMS_PROFILER (and DBMS_PLSQL_CODE_COVERAGE on Oracle database version 12.2 and above) provided with the Oracle database.
 Code coverage is gathered for the following source types:
+
 * package bodies
 * type bodies
 * triggers
@@ -15,6 +15,7 @@ Code coverage is gathered for the following source types:
 
 To obtain information about code coverage for unit tests, run utPLSQL with one of the built-in code coverage reporters.
 The following code coverage reporters are supplied with utPLSQL:
+
 * `ut_coverage_html_reporter` - generates an HTML coverage report providing summary and detailed information on code coverage. The HTML reporter is based on the open-source [simplecov-html](https://github.com/colszowka/simplecov-html) reporter for Ruby. It includes the source code that was covered (if the code is accessible to the test user)
 * `ut_coveralls_reporter` - generates a [Coveralls compatible JSON](https://coveralls.zendesk.com/hc/en-us/articles/201774865-API-Introduction) coverage report providing detailed information on code coverage with line numbers. This coverage report is designed to be consumed by cloud services like [Coveralls](https://coveralls.io)
 * `ut_coverage_sonar_reporter` - generates a [Sonar Compatible XML](https://docs.sonarqube.org/latest/analysis/generic-test/) coverage report providing detailed information on code coverage with line numbers. This coverage report is designed to be consumed by services like [SonarQube](https://www.sonarqube.org/) and [SonarCloud](https://about.sonarcloud.io/)
@@ -23,6 +24,7 @@ The following code coverage reporters are supplied with utPLSQL:
 ## Security model
 utPLSQL code coverage uses DBMS_PROFILER to gather information about the execution of code under test and therefore follows the [DBMS_PROFILER Security Model](https://docs.oracle.com/database/121/ARPLS/d_profil.htm#ARPLS67465).
 In order to be able to gather coverage information, the user executing unit tests needs to either:
+
 * be the owner of the code that is being tested, or
 * have the following privileges to be able to gather coverage on code owned by other users:
   * `create any procedure` system privilege
@@ -144,6 +146,7 @@ You may specify both _include_ and _exclude_ options to gain more control over w
 
 **Important notes**
 The order of priority for evaluation of the include/exclude filter parameters is as follows:
+
 - if `a_source_file_mappings` is defined, all include/exclude parameters are ignored (see the section below for usage of the `a_source_file_mappings` parameter)
 - else if `a_include_schema_expr` or `a_include_object_expr` is specified, parameters `a_coverage_schemes` and `a_include_objects` are ignored
 - else if `a_include_objects` is specified, coverage is gathered only on the specified database objects.
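As an aside, the include/exclude precedence described above can be sketched as a simple conditional chain. This Python sketch is illustrative only: the dictionary keys mirror the documented parameter names, but the function itself is hypothetical and not part of utPLSQL.

```python
def resolve_coverage_scope(params):
    """Illustrative precedence of utPLSQL coverage filter parameters.

    The keys mirror documented parameter names; this helper is hypothetical.
    """
    if params.get("a_source_file_mappings"):
        # File mappings win: all other include/exclude parameters are ignored.
        return "file_mappings"
    if params.get("a_include_schema_expr") or params.get("a_include_object_expr"):
        # Regex filters suppress a_coverage_schemes and a_include_objects.
        return "regex_filters"
    if params.get("a_include_objects"):
        # Explicit object list: coverage gathered only on those objects.
        return "object_list"
    # Otherwise, fall back to the coverage schemes.
    return "schema_defaults"

print(resolve_coverage_scope({"a_include_objects": ["ut3.betwnstr"]}))  # object_list
```

Note how the first matching rule short-circuits the rest, matching the "if / else if" wording of the documentation.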
@@ -300,6 +303,7 @@ They are abstracted from database, schema names, packages, procedures and functi
 To be able to effectively use reporters dedicated for those tools, utPLSQL provides functionality for mapping database object names to project files.
 
 There are a few significant differences when running coverage on project files compared to running coverage on schema(s).
+
 - Coverage is only reported on objects that were successfully mapped to project files.
 - Project files (database objects) that were not executed at all are not reported as fully uncovered. It is up to the consumer (Sonar/Coveralls) to determine whether a project file should be considered as 0% coverage or just ignored.
 
@@ -335,6 +339,7 @@ C:
 ```
 
 By default, utPLSQL will convert file paths into database objects using the following regular expression `/(((\w|[$#])+)\.)?((\w|[$#])+)\.(\w{3})$`
+
 - object owner (if it is present) is identified by the expression in the second set of brackets
 - object name is identified by the expression in the fourth set of brackets
 - object type is identified by the expression in the sixth set of brackets
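The default mapping pattern above can be exercised outside the database. The sketch below uses Python's `re` module with the bare pattern (dropping the leading `/`); the file paths are hypothetical, and the group numbers follow the "second/fourth/sixth set of brackets" description from the documentation.

```python
import re

# Default utPLSQL file-path-to-object pattern (leading "/" dropped):
# group 2 = object owner (optional), group 4 = object name, group 6 = extension.
PATTERN = re.compile(r"(((\w|[$#])+)\.)?((\w|[$#])+)\.(\w{3})$")

def map_path(path):
    """Illustrative only: extract (owner, name, extension) from a file path."""
    match = PATTERN.search(path)
    if not match:
        return None
    return match.group(2), match.group(4), match.group(6)

# Hypothetical project file paths:
print(map_path("sources/hotel/rooms.pck"))  # -> (None, 'rooms', 'pck')
print(map_path("sources/hotel.rooms.pck"))  # -> ('hotel', 'rooms', 'pck')
```

The owner group only participates when the object name is directly preceded by `owner.`, which is why the first path yields no owner.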

docs/userguide/exception-reporting.md

Lines changed: 1 addition & 2 deletions

@@ -1,12 +1,11 @@
 ![version](https://img.shields.io/badge/version-v3.1.13.4036--develop-blue.svg)
 
-# Exception handling and reporting
-
 utPLSQL is responsible for handling exceptions wherever they occur in the test run. utPLSQL traps most exceptions so that test execution is not affected by individual tests or test packages throwing an exception.
 The framework provides a full stacktrace for every exception that was thrown. The stacktrace is clean and does not include any utPLSQL library calls in it.
 To achieve rerunability, the package state invalidation exceptions (ORA-04068, ORA-04061) are not handled and test execution will be interrupted if such exceptions are encountered. This is because of how Oracle behaves on those exceptions.
 
 Test execution can fail for different reasons. Failures on different exceptions are handled as follows:
+
 * A test package without a body - each `--%test` is reported as failed with an exception, nothing is executed
 * A test package with an _invalid body_ - each `--%test` is reported as failed with an exception, nothing is executed
 * A test package with an _invalid spec_ - the package is not considered a valid unit test package and is excluded from execution. When trying to run a test package with an invalid spec explicitly, an exception is raised. Only valid specifications are parsed for annotations

docs/userguide/expectations.md

Lines changed: 10 additions & 7 deletions

@@ -1,6 +1,7 @@
 ![version](https://img.shields.io/badge/version-v3.1.13.4036--develop-blue.svg)
 
-# Expectation concepts
+## Expectation concepts
+
 Validation of the code under test (the tested logic of a procedure/function etc.) is performed by comparing the actual data against the expected data.
 
 utPLSQL uses expectations and matchers to perform the check on the data.
@@ -82,12 +83,14 @@ SUCCESS
 **Note:**
 > The examples in the document will only use the shortcut syntax, to keep the document brief.
 
-# Using expectations
+## Using expectations
+
 There are two ways to use expectations:
 - by invoking the utPLSQL framework to execute suite(s) of utPLSQL tests
 - without invoking the utPLSQL framework - running expectations standalone
 
 ## Running expectations within utPLSQL framework
+
 When expectations are run as part of a test suite, the framework tracks:
 - status of each expectation
 - outcomes (messages) produced by each expectation
@@ -166,7 +169,7 @@ When expectations are invoked outside of utPLSQL framework the outputs from expe
 **Note:**
 > The examples in the document will only use standalone expectations, to keep the document brief.
 
-# Matchers
+## Matchers
 utPLSQL provides the following matchers to perform checks on the expected and actual values.
 
 - `be_between( a_upper_bound {data-type}, a_lower_bound {data-type} )`
@@ -312,7 +315,7 @@ FAILURE
 ```
 Since NULL is neither *true* nor *false*, both expectations will report failure.
 
-# Supported data types
+## Supported data types
 
 The matrix below illustrates the data types supported by different matchers.
 
@@ -336,7 +339,7 @@ The matrix below illustrates the data types supported by different matchers.
 | **be_within().of_()** | | | | X | X | X | X | X | | | | | | | |
 | **be_within_pct().of_()** | | | | | X | | | | | | | | | | |
 
-# Expecting exceptions
+## Expecting exceptions
 
 Testing is not limited to checking for happy-path scenarios. When writing tests, you often want to validate that in specific scenarios, an exception is thrown.
 
@@ -385,7 +388,7 @@ Finished in .009229 seconds
 For more details see documentation of the [`--%throws` annotation](annotations.md#throws-annotation).
 
 
-# Matchers
+## Matchers
 
 You can choose different matchers to validate that your PL/SQL code is working as expected.
 
@@ -1747,7 +1750,7 @@ FAILURE
 at "anonymous block", line 32
 ```
 
-# Comparing Json objects
+## Comparing Json objects
 
 utPLSQL is capable of comparing the json data-types `json_element_t` **on Oracle 12.2 and above**, and also `json` **on Oracle 21 and above**.

docs/userguide/getting-started.md

Lines changed: 14 additions & 14 deletions

@@ -7,7 +7,7 @@ utPLSQL is designed in a way that allows you to follow
 
 Below is an example of building a simple function with TDD.
 
-# Gather requirements
+## Gather requirements
 
 We have a requirement to build a function that will return a substring of a string that is passed to the function.
 
@@ -17,12 +17,12 @@ The function should accept three parameters:
 - start_position
 - end_position
 
-# Create a test
+## Create a test
 
 We will start from the bare minimum and move step by step, executing tests every time we make minimal progress.
 This way, we assure we don't jump ahead too much and produce code that is untested or untestable.
 
-## Create test package
+### Create test package
 
 ```sql
 create or replace package test_betwnstr as
@@ -43,7 +43,7 @@ Finished in .451423 seconds
 0 tests, 0 failed, 0 errored, 0 disabled, 0 warning(s)
 ```
 
-## Define specification for the test
+### Define specification for the test
 
 ```sql
 create or replace package test_betwnstr as
@@ -76,7 +76,7 @@ Finished in .509673 seconds
 
 Well, our test is failing as the package specification requires a body.
 
-## Define body of first test
+### Define body of first test
 
 ```sql
 create or replace package body test_betwnstr as
@@ -110,9 +110,9 @@ Finished in .415851 seconds
 Our test is failing as the test suite package body is invalid.
 Looks like we need to define the function we want to test.
 
-# Implement code to fulfill the requirement
+## Implement code to fulfill the requirement
 
-## Define tested function
+### Define tested function
 
 ```sql
 create or replace function betwnstr( a_string varchar2, a_start_pos integer, a_end_pos integer ) return varchar2
@@ -143,7 +143,7 @@ Finished in .375178 seconds
 So now we see that our test works but the function does not return the expected results.
 Let us fix this and continue from here.
 
-## Fix the tested function
+### Fix the tested function
 
 The function returned a string one character short, so we need to add 1 to the substr parameter.
 
@@ -169,14 +169,14 @@ Finished in .006077 seconds
 
 So our test is now passing, great!
 
-# Refactor
+## Refactor
 
 Once our tests are passing, we can safely refactor (restructure) the code as we have a safety harness
 in place to ensure that after the restructuring and cleanup of the code, everything is still working.
 
 One thing worth mentioning is that refactoring of tests is as important as refactoring of code. Maintainability of both is equally important.
 
-# Further requirements
+## Further requirements
 
 It seems like our work is done. We have a function that returns a substring from start position to end position.
 As we move through the process of adding tests, it's very important to think about edge cases.
@@ -195,7 +195,7 @@ Here is a list of edge cases for our function:
 We should define expected behavior for each of these edge cases.
 Once defined, we can start implementing tests for those behaviors and adjust the tested function to meet the requirements specified in the tests.
 
-## Add test for additional requirement
+### Add test for additional requirement
 
 A new requirement was added:
 Start position zero - should be treated as start position one
@@ -250,7 +250,7 @@ Finished in .232584 seconds
 
 Looks like our function does not work as expected for a zero start position.
 
-## Implementing the requirement
+### Implementing the requirement
 
 Let's fix our function so that the new requirement is met.
 
@@ -281,7 +281,7 @@ Finished in .012718 seconds
 
 Great! We have made some visible progress.
 
-## Refactoring
+### Refactoring
 
 When all tests are passing, we can proceed with a safe cleanup of our code.
 
@@ -308,7 +308,7 @@ Finished in .013739 seconds
 2 tests, 0 failed, 0 errored, 0 disabled, 0 warning(s)
 ```
 
-# Remaining requirements
+## Remaining requirements
 
 You may continue on with the remaining edge cases from here.
 
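As a side note, the behaviour built up in the getting-started tutorial above (a substring between two positions, with start position zero treated as one) can be transcribed into a few lines of Python for quick experimentation. This is an illustrative sketch only, not the PL/SQL implementation, and it ignores edge cases the tutorial leaves open (negative positions, NULL inputs).

```python
def betwnstr(a_string, a_start_pos, a_end_pos):
    """Python transcription of the tutorial's betwnstr logic (illustrative)."""
    if a_start_pos == 0:
        a_start_pos = 1  # requirement: start position zero behaves like one
    # PL/SQL equivalent: substr(a_string, a_start_pos, a_end_pos - a_start_pos + 1)
    return a_string[a_start_pos - 1:a_end_pos]

print(betwnstr("1234567", 2, 5))  # -> 2345
print(betwnstr("1234567", 0, 5))  # -> 12345
```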
