# Development
This part of the Wiki is dedicated to developers who want to contribute to the project.
A concise overview of the application's architecture is given, and the key concepts for developing new plugins or services and for contributing tests are explained. It acts as a starting guide to get you up and running in no time.
## Setting up the development environment
Clone the [development branch](https://github.com/comlet/releasefab/tree/develop) from the ReleaseFab GitHub repository to obtain a local copy of the project. The following instructions assume the repository was cloned to a folder called "ReleaseFab". All paths are relative to that directory.
### Building the application
The application is built with Gradle. The Gradle project consists of a root project and a subproject per Java module. This makes the project a standard Gradle multi-project build.
The build is started via the `gradlew.bat` script on Windows machines and the `gradlew.sh` script on Unix-based machines. There are multiple targets available, but to build the application the command

    gradlew.bat build

on Windows and

    gradlew.sh build

on Linux-based machines is sufficient.
### Build Targets
Common targets are `createDelivery` together with `copyFiles` and `createJlinkDelivery` to package all artifacts and resources together in a deliverable folder. `createDelivery` keeps the modules separated in different JARs. `createJlinkDelivery` links all modules defined in `application/build.gradle` together with the required modules from the JDK to produce a custom runtime image which does not require a separate JRE to run.
| Custom Target | Description |
|------------------------|----------------------------------------------------------------------------------------------------------------------|
| createDelivery | Copy all jarred artifacts to the delivery folder |
| copyFiles | Copy all required files to the delivery folder |
| createJlinkDelivery    | Link all defined modules together and create a custom runtime image |
| ci_cd                  | Run the build target and sonarqube (only for Cloud build - [set up SonarLint](#configuring-sonarlint) for local development) |
| checkstyle | Run checkstyle analysis on *.java files except the module descriptors |
| checkstyleMain | Run checkstyle on the main.java source set |
| checkstyleTest | Run checkstyle on the test.java source set |
| doxygen | Generate JavaDoc in HTML format using doxygen |
| removeTempDoxygenFiles | Delete temporary files generated by doxygen from the repository |
| test | Build the application and run unit-tests |
| eclipse | Create Eclipse project files within the repository |
| cleanEclipse | Remove the folder "Eclipse-ReleaseFab" |
| createEclipseWorkspace | Run eclipse task and copy the output into the folder "Eclipse-ReleaseFab" to [import the project into an IDE](#using-eclipse) |
| deleteTempEclipseFiles | Remove the files generated by the "eclipse" target which otherwise pollute the Git repository |
| jlink                  | Run the "build" target and package the application into a runtime image |
The standard Java Gradle targets are described by running `gradlew.bat|sh tasks`. Further documentation on these tasks is provided by [Gradle](https://docs.gradle.org/current/userguide/java_plugin.html).
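For example, a complete delivery could be packaged on Windows by combining the two packaging targets, or a self-contained runtime image could be produced instead (use `gradlew.sh` on Linux):

```
gradlew.bat createDelivery copyFiles
gradlew.bat createJlinkDelivery
```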
### Build Parameters
When building the application together with an ALM plugin, the parameters whose names contain ALM should be passed in order to run the integration tests against that plugin.
The following custom parameters are supported by the Gradle Build and can be passed when running the `build` target:
| Option | Type | Description | Default |
|------------------------------|---------|--------------------------------------------------------|---------|
| ALM_USER | String | User name for test of ALM Service | None |
| ALM_PASSWORD | String | Password for test of ALM Service | None |
| ALM_SERVER | String | ALM Server URL for tests | None |
| ALM_CERTIFICATE_PATH_WINDOWS | String | Path to certificate of ALM server on Windows machines | None |
| ALM_CERTIFICATE_PATH_LINUX | String | Path to certificate of ALM server on Linux machines | None |
| ENABLE_ALM_TESTS | Boolean | Enable tests of ALM service | None |
| PLATFORM | String | Build for Windows or Linux (SWT is platform dependent) | windows |
**_NOTE:_** Gradle requires these options to be set with a leading `-P`.
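As an illustration, a local Linux build that enables the ALM integration tests might be invoked as follows (user, password, server and certificate values are placeholders for your own environment):

```
gradlew.sh build -PPLATFORM=linux -PENABLE_ALM_TESTS=true \
    -PALM_USER=jdoe -PALM_PASSWORD=secret \
    -PALM_SERVER=https://alm.example.com \
    -PALM_CERTIFICATE_PATH_LINUX=/path/to/certificate.pem
```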
Every file that is produced by the Gradle build is saved to the `ReleaseFab_products` folder which is created alongside the ReleaseFab folder. Subdirectories exist for the compiled classes, JUnit test reports and packaged deliveries including all of the necessary scripts.
### Using Eclipse
Under normal circumstances, a project which is built with Gradle can be imported via the Gradle plugin for Eclipse called Gradle Buildship. That plugin currently does not support multi-project builds which use the Java Platform Module System and the modulepath ([GitHub issue](https://github.com/eclipse/buildship/issues/658)).
As an alternative, Gradle's Eclipse plugin can be used, which serves the same purpose. It creates Eclipse `.project` and `.classpath` files representing the Gradle structure. Unlike with Gradle Buildship, the content of these files is configurable from the Gradle build scripts.
When running `createEclipseWorkspace` the files are correctly configured, created and copied from the source folders to the `../ReleaseFab-Eclipse` directory. This directory can be opened as an Eclipse Workspace at startup. Afterwards all projects can be imported by using `File -> Import -> General -> Existing projects into workspace`. In the wizard choose the path `../ReleaseFab-Eclipse` as the directory containing the projects. After applying all changes, the projects show up in the Eclipse Project Explorer.
This approach keeps the local repository free of IDE-specific files. Otherwise, it would be necessary to exclude those files via the `.gitignore` file.
#### First Start
Next a `Run Configuration` needs to be defined. Eclipse will pick up all dependencies required to launch the application, but some parameters still need to be configured.
To do so, go to the `Main` class in the `de.comlet.releasefab` package of the ReleaseFab_Application project and select `Run as... -> Java Application`. After the command has been executed, go to `Run -> Run Configurations` and select the `Main` run configuration.
In the `Arguments` tab enter `source=` and the absolute path to the root directory of the local ReleaseFab repository. This path also needs to be set as the `Working directory` further down on the same tab. Next enter `generalsettings=` and the absolute path to the main settings file `application/settings.xml` in the local repository.
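On a Windows machine, the program arguments could then look something like this (the repository path is a placeholder for your local clone):

```
source=C:\Repos\ReleaseFab generalsettings=C:\Repos\ReleaseFab\application\settings.xml
```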
If you follow these steps the application should launch correctly and display its own version information.
### Using IntelliJ IDEA
Just like Eclipse, IntelliJ IDEA has difficulties importing Gradle multi-project builds correctly without polluting the repository. There is also a Gradle plugin for IntelliJ IDEA, but it is not used because IntelliJ IDEA can import existing Eclipse projects. To do so, follow these steps:
1. Run the build target `createEclipseWorkspace`. This will create Eclipse-specific files in the `../ReleaseFab-Eclipse` directory as mentioned above. This directory can be imported into IntelliJ IDEA.
1. Open IDEA and select `New -> Empty Project`. Choose any directory except `../ReleaseFab-Eclipse` for the empty project and click `OK`. Close all popup windows.
1. Next select `File -> New -> Project from existing source` and choose the directory `../ReleaseFab-Eclipse` to import from.
1. In the next window use the option `Import project from external model` and choose `Eclipse`, NOT `Gradle`.
1. The options on the next page should already be set correctly, so click `Next`.
1. All projects included in ReleaseFab should show up, so the selection can be confirmed. The project code style can be changed later, so the default is accepted for now. On the next page, choose whichever JDK should be used, provided it fulfills the criteria mentioned in the [README](https://github.com/comlet/releasefab#readme). A click on `Finish` imports all of the projects. They can be opened either in a new window or in the existing one; this is entirely up to you.
#### First Start
Next a `Run Configuration` needs to be defined. IDEA will pick up all dependencies required to launch the application, but some parameters still need to be configured.
To do so, select `Run as... -> Java Application`. After the command has been executed, go to `Add Configuration` and select the `+` sign in the top right corner.
Choose `Application` to create a new configuration. Name it appropriately and select the `Main` class from the `de.comlet.releasefab` package of the `releasefab.application` module.
In the `Program Arguments` field enter `source=` and the absolute path to the root directory of the local ReleaseFab repository. This path also needs to be set as the `Working directory` further down. Next enter `generalsettings=` and the absolute path to the main settings file `settings.xml` in the local repository.
Apply all changes and run the newly created configuration. If you get a compilation error regarding the SWT platform, please follow the first suggestion of the IDE to add the platform-specific library to the module descriptor. This is only necessary when using IntelliJ IDEA.
If you follow these steps the application should launch correctly and display its own version information.
### Configuring SonarLint
Code which is part of this project is statically checked on SonarCloud as part of the CI pipeline. To make sure new code does not raise new issues and passes the Quality Gate once it is merged into the code base, new code can be checked from within the IDE using SonarLint.
To do so, follow the installation instructions for your IDE, which are available [here](https://www.sonarlint.org/).
To ensure reproducibility, the same set of rules used on SonarCloud must be used locally. Follow the instructions given [here](https://github.com/SonarSource/sonarlint-eclipse/wiki/Connected-Mode) to sync the rules between the local and the cloud project in Eclipse, or use the instructions given [here](https://github.com/SonarSource/sonarlint-intellij/wiki/Bind-to-SonarQube-or-SonarCloud) to do the same in IntelliJ IDEA.
## Coding Styles
The general coding style requirements are rather minimal. However, they are checked by the Sonar rules defined [here](). If any of these rules are violated, your Pull Request cannot be accepted.
Furthermore, there are some conventions which are not part of these rules:
- The name of every Java source file starts with a prefix according to its content
+ ICL - Interface
+ ACL - Abstract Class
+ CCL - Class
- Source sets are separated into
+ src/main
+ src/test
- Package names start with `de.comlet.releasefab.{your content}`
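As a small illustration of these conventions, a new interface placed under `src/main` in a hypothetical package could look like this (the package suffix and type name are placeholders):

```java
// Interfaces carry the ICL prefix; abstract classes use ACL and concrete classes CCL.
package de.comlet.releasefab.example;

public interface ICLExampleProvider
{
   String provideExample();
}
```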
## Debugging
The application is debugged from the IDE of your choice. Change the parameters of the Debug Run Configuration in order to debug errors which occur with specific configuration files or files containing version information.
## Architecture
ReleaseFab is developed using the Java Platform Module System introduced in Java 9. The application consists of the following modules:
- Core of the application
+ releasefab.application
- API layer
+ releasefab.library
- Plugins
+ releasefab.git.plugin
+ releasefab.version
+ releasefab.importantinformation
- Services
+ releasefab.git.service
- Classes
+ releasefab.git.classes
To be built, these modules only depend on the ReleaseFab library and their own external dependencies; they are therefore decoupled from one another. When ReleaseFab is launched, they are collected from a specific directory and put on the Java Modulepath. From there they are accessed by the core of the application through the [Java ServiceLoader API](https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/util/ServiceLoader.html).
The dependency graph below shows the static architecture of the application as well as the links established at runtime.
Modular architecture of ReleaseFab
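To illustrate how the core picks up plugins from the Modulepath, here is a minimal sketch of the ServiceLoader mechanism; it is not the actual loading code of ReleaseFab, and the consuming module additionally needs a `uses de.comlet.releasefab.api.plugin.ACLImportStrategy;` directive in its descriptor:

```java
import java.util.ServiceLoader;

import de.comlet.releasefab.api.plugin.ACLImportStrategy;

public final class PluginDiscoveryExample
{
   public static void main(String[] args)
   {
      // Collect every ACLImportStrategy implementation that a module on the
      // Modulepath registers via "provides ... with ..." in its descriptor.
      ServiceLoader<ACLImportStrategy> loader = ServiceLoader.load(ACLImportStrategy.class);
      for (ACLImportStrategy strategy : loader)
      {
         System.out.println("Found plugin: " + strategy.getClass().getName());
      }
   }
}
```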
## Developing a new plugin
All the classes which can be implemented as part of a plugin are part of the package `de.comlet.releasefab.api.plugin` in the `releasefab.library` module. Every plugin defines a new Java Platform module and therefore needs to declare a [JPMS module descriptor](). In that file a module declares its dependencies and the implementations of abstract classes or interfaces it provides to other modules on the Java Modulepath.
A plugin needs to implement at least `ACLImportStrategy` in order to be recognized by the core of the application. Accordingly, every module descriptor needs to include the line

    provides ACLImportStrategy with CCLImportPluginName;

as well as its dependency on the API layer `releasefab.library` via the line

    requires releasefab.library;

Afterwards, the desired functionality of the plugin, as defined in the abstract classes and interfaces, has to be implemented. The purpose of each method is documented in the source code and is largely self-explanatory.
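Putting this together, the complete module descriptor of a hypothetical plugin might look as follows (the module name and the implementing class are placeholders; additional `requires` directives depend on the plugin's own dependencies):

```java
module releasefab.myplugin {
   requires releasefab.library;

   // Register the plugin's import strategy with the ServiceLoader mechanism.
   provides de.comlet.releasefab.api.plugin.ACLImportStrategy
         with de.comlet.releasefab.myplugin.CCLImportMyPlugin;
}
```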
## Developing a new service
There are two types of services which can be implemented: Version Control System (VCS) services and Application Lifecycle Management (ALM) system services.
Just like a plugin, every service is contained inside its own Java Platform module. It therefore needs to declare the dependencies it requires and the implementations it provides. An example is given in the section above; just replace the abstract class or interface your class provides an implementation for.
A Version Control System service needs to implement at least `ICLVersionControlUtility` from the package `de.comlet.releasefab.api.vcsservice`. Implementations of basic data structures like `ICLCommitContainer` and `ICLTagContainer` have to be placed in a separate module in case they are used by both the service and the plugin.
An Application Lifecycle Management system service needs to implement at least `ICLALMUtility` from the package `de.comlet.releasefab.api.almservice`. Implementations of basic data structures like `ICLALMItemContainer` have to be placed in a separate module in case they are used by both the service and the plugin.
The purpose of the methods which need to be implemented is documented in the source code.
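As a sketch, the module descriptor of a hypothetical Version Control System service might look as follows (module and class names are placeholders; the existing `releasefab.git.service` module can serve as a real reference):

```java
module releasefab.myvcs.service {
   requires releasefab.library;

   // Expose the service implementation to the core via the ServiceLoader mechanism.
   provides de.comlet.releasefab.api.vcsservice.ICLVersionControlUtility
         with de.comlet.releasefab.myvcs.service.CCLMyVcsUtility;
}
```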
## Testing
Unit tests exist for some parts of the application. They are located in the same package as their class-under-test but in a separate source set as shown below.

    application
    |---ReleaseFab_Application
    | |---src
    | | |---main
    | | | |---package.name
    | | |---test
    | | | |---package.name
    | |---build.gradle
    | |---settings.gradle

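For example, a test for a hypothetical class `CCLExample` in the package `de.comlet.releasefab.example` would live under `src/test` in the same package (a minimal sketch assuming JUnit 5; class and package names are placeholders):

```java
package de.comlet.releasefab.example;

import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

class CCLExampleTest
{
   @Test
   void getterReturnsValuePassedToSetter()
   {
      // CCLExample is a hypothetical data class with a name property.
      CCLExample example = new CCLExample();
      example.setName("ReleaseFab");
      assertEquals("ReleaseFab", example.getName());
   }
}
```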
Every new class requires corresponding and passing unit tests when it is checked in for the first time. This is a key requirement for a merge request to be approved and included into the software. Only basic data classes, containing nothing but member attributes with getters and setters, are excluded from this rule.
Some unit test classes for existing classes in production are still missing. See item number # //Add link to item on project board. Merge requests are welcome. Adding unit tests for existing classes is also a good first issue to make yourself comfortable with the project.
When building the project locally with a custom ALM system, the ALM specific settings must be passed to Gradle as command line parameters. The keys can be found [here](Usage).
## CI/CD Workflow
This section of the Wiki is dedicated to the branching scheme and the CI/CD pipeline. These rules should be followed as they allow a structured way to continuously integrate and deploy changes.
### Continuous Integration into develop branch
The project requires a feature branch for every item of the project board which is worked on. These feature branches are included into the `develop` branch via Merge Requests as shown in the diagram below. For this Merge Request to be merged the following CI pipeline needs to complete successfully.
Continuous Integration into develop branch
### Continuous Integration into main branch
Changes made to the `develop` branch are merged into the default branch `main` at a given time by the project administration as shown in the diagram below. The following steps of the CI pipeline are executed in that event.
Continuous Integration into main branch
### Continuous Delivery
As soon as all features which should be part of a new version are done, a Git tag is pushed to the `main` branch marking the delivery. The pipeline needs to complete the following stages successfully in order to create a new delivery.
Continuous Delivery
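As mentioned above, a delivery is triggered by pushing a Git tag to `main`. For illustration, this could look like the following (the tag name is a placeholder; use the versioning scheme of the project):

```
git tag -a v1.2.3 -m "ReleaseFab v1.2.3"
git push origin v1.2.3
```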
After these steps have completed, the documentation generated by doxygen is uploaded to a sub-directory named after the new version on the GitHub Pages branch (gh-pages). The documentation of the newest version is accessible [here](). Documentation of older versions can be found by replacing the version in the URL.
The binary artifacts are packaged together with configuration files and uploaded to the new release in the GitHub [Releases section](https://github.com/comlet/releasefab/releases). An archive of the sources of that version as well as the Release Notes generated by ReleaseFab in advance can be found there.