Getting started with templated features¶
The Templated Feature Ecosystem¶
A templated feature is like a giant code snippet into which the DSM Definitions are injected and recursively consumed to generate the repetitive code that implements a feature for a data model.
A templated feature may depend on another templated feature; together they form an ecosystem.
The kibo code generator knows how to map the DSM Definitions primitive types
(bool, int8, float, ...) and generic collections (vector, ...).
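To make this mapping concrete, here is a purely hypothetical sketch of the kind of primitive-type table a generator like kibo could maintain for C++ output; the actual mapping lives inside kibo and may differ.

```python
# Hypothetical illustration of a DSM-primitive-to-C++ type table;
# the real mapping is internal to kibo and may differ.
CPP_TYPE_MAP = {
    'bool': 'bool',
    'int8': 'std::int8_t',
    'float': 'float',
}

def cpp_type(dsm_type: str) -> str:
    """Return the C++ spelling of a DSM primitive type."""
    return CPP_TYPE_MAP[dsm_type]

print(cpp_type('int8'))  # → std::int8_t
```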
C++ templated features¶
The templated features from the folder ./templates/cpp are based on the features
provided by the C++ Viper runtime. The features are organized by theme, and a specific
feature may be implemented by several source and header files in a folder.
Here is a brief description of the features implemented, by folder.
| Folder | Description |
|---|---|
| Data | Type implementations for enums, structs, concepts, clubs |
| Model | The DSM Definitions, Path, Field |
| Database | The persistence layer based on sqlite3 |
| Stream | The encoder/decoder for streams |
| Json | The encoder/decoder for Json |
| Commit | The Commit API |
| CommitFunctionPool | The Viper binding for all commit pools |
| CommitFunctionPoolCommit | The Viper binding for a pool exposing the Commit API |
| CommitFunctionPoolRemote | The viper binding for a remote commit pool |
| FunctionPool | The Viper binding for all pools |
| FunctionPoolRemote | The API to call remote pools (Service) |
| ValueType | The Viper types for all DSM Definitions |
| ValueCodec | The bidirectional bridge for static and dynamic representation of a value |
| Python | Python module constants for types and paths. |
| Fuzz | The API to generate random values. |
| Test | Unit tests for the Fuzz, Json, Stream and Database features. |
| TestApp | Applications for running the unit tests. |
Read Kibo Template Model to write your own templates. You should also read the source code of the provided templates to learn how to decompose the implementation of your feature into the giant code snippet.
Python templated feature¶
The templated features from the folder ./templates/python are based on the features
provided by dsviper (the Python extension for Viper). The features are organized by file.
| Folder/File | Description |
|---|---|
| package/definitions | Definitions expressed in the Viper Type system |
| package/data | Classes for concepts, clubs, enumerations and structures |
| package/commit | The Commit API |
| package/database | The Database API |
| package/value_types | The Dynamic definitions of types required by the data model |
| package/commit_function_pools | Wrapped API for all commit pools |
| package/commit_function_pool_remotes | Wrapped API for all remote commit pools |
| package/function_pools | Wrapped API for all pools |
| package/function_pool_remotes | Wrapped API to call remote pools (Service) |
| wheel/pyproject.toml | The minimal pyproject.toml to build a wheel |
All the features are grouped in a Python package.
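Based on the table above, a generated package named, say, exp would presumably look like the layout below; the exact file names are an assumption, the templates decide the real layout.

```
exp/
├── definitions.py
├── data.py
├── commit.py
├── database.py
├── value_types.py
├── commit_function_pools.py
├── commit_function_pool_remotes.py
├── function_pools.py
└── function_pool_remotes.py
```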
The Generate step¶
When we start to use templated features, we must decide what to generate and where to put the generated code.
Each project uses the generated code for various needs at various steps.
- low-level serialization to insert in a Framework.
- unit test to insert in the test infrastructure.
- just a part of a feature to insert in a Framework.
- just a part of a feature to insert in an application.
- just a part for the Python module embedded in the application.
- a Python package embedded in a C++ application.
- a Python wheel to distribute.
- ...
We need something like a generate step for each project.
To illustrate the idea, we have extracted some code from the generate.py scripts of
various projects to show the various uses of the generated code.
Generate code for the exp project¶
```python
# Extracted from exp/generate.py
...
if arguments.cpp:
    # Generate a framework
    print('** Render Cpp')
    render_templates(namespace=NAMESPACE, dsmb_path=DSMB_PATH, output=PROJECT)
    generate_resource(definitions=DEFINITIONS, output=f'{NAMESPACE}/{NAMESPACE}_Resources.hpp')
    # Generate an application for testing the generated code.
    generate(namespace=NAMESPACE, dsmb_path=DSMB_PATH, template='TestApp', output=".")
if arguments.package:
    # Generate the Python package
    print('** Render Python Package')
    generate_package(name='exp', dsmb_path=DSMB_PATH, definitions=DEFINITIONS, output='python/exp')
...
```
Generate code for the Raptor Editor project¶
```python
# Extracted from com.digitalsubstrate.red/generate.py
...
if arguments.red:
    # Generate the framework Raptor
    print('** Render C++ for Red')
    render_templates(namespace=NAMESPACE, dsmb_path=DSMB_PATH, output=f'src/{NAMESPACE}')
    generate_resource(definitions=DEFINITIONS, output=f'src/{NAMESPACE}/{NAMESPACE}_Resources.hpp')
if arguments.logic:
    # Generate a part of the RaptorLogic framework
    print('** Render C++ for RaptorLogic')
    generate_logic(namespace='RaptorLogic', dsmb_path=DSMB_PATH, template='RaptorLogic', output='src/RaptorLogic')
if arguments.editor:
    # Generate a part of the application
    print('** Render C++ for RaptorEditor')
    generate(namespace='RE', dsmb_path=DSMB_PATH, template='Python', output='RaptorEditor/RaptorEditor')
if arguments.python:
    # Generate the Python package embedded in the application
    print('** Render Raptor package for RaptorEditor')
    generate_package(name='red', dsmb_path=DSMB_PATH, definitions=DEFINITIONS, output='RaptorEditor/RaptorEditor/Scripts/red')
...
```
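The `arguments` object seen in both extracts is presumably built with something like argparse; here is a minimal sketch, where only the flag names are taken from the extracts and the real scripts may differ.

```python
import argparse

# Hypothetical reconstruction of the flag parsing behind `arguments`;
# only the flag names come from the extracts above.
parser = argparse.ArgumentParser(description='Generate templated features')
parser.add_argument('--cpp', action='store_true', help='generate the C++ framework')
parser.add_argument('--package', action='store_true', help='generate the Python package')

arguments = parser.parse_args(['--cpp'])
print(arguments.cpp, arguments.package)  # → True False
```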
How to generate templated features¶
Here are the steps to generate features with dsviper from the generate.py script.
- You can create your own tool with the C++ Viper API to generate the code.
- You can use custom rules in your build system (cmake, ...)
1. Save a binary representation of the DSM Definitions¶
We convert the DSM Definitions to the pre-parsed binary representation required by
kibo-1.2.0.jar (see the manual).

```python
def save_definitions(definitions: DSMDefinitions, dsmb_path: str):
    with open(dsmb_path, "wb") as binary_file:
        binary_file.write(definitions.encode().encoded())

BUILDER = DSMBuilder.assemble(DSM_PATH)
REPORT, DSM_DEFINITIONS, DEFINITIONS = BUILDER.parse()
check_report(report=REPORT)
save_definitions(definitions=DSM_DEFINITIONS, dsmb_path=DSMB_PATH)
```
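A small sanity check you may want after this step (a hypothetical helper, not part of the script): verify that the bytes written to disk match the encoded blob.

```python
from pathlib import Path

def check_dsmb(dsmb_path: str, blob: bytes) -> bool:
    """Return True when the .dsmb file content equals the encoded blob.

    Hypothetical helper for illustration; not part of generate.py.
    """
    return Path(dsmb_path).read_bytes() == blob

# Demonstration with a throwaway file standing in for the .dsmb
import os
import tempfile

with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b'\x01\x02\x03')
print(check_dsmb(tmp.name, b'\x01\x02\x03'))  # → True
os.remove(tmp.name)
```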
2. Generate the embedded Definitions¶
We generate the encoded C++ resource included by the C++ feature
Model/Definitions.cpp.

```python
# Generate the embedded representation of the Definitions
def generate_resource(definitions: DefinitionsConst, output: str):
    blob = definitions.encode()
    with open(output, 'w') as file:
        file.write(blob.embed("definitions"))
```
3. Generate C++ features¶
We need to write a few functions that describe the list of templated features
the project needs, and then call kibo-1.2.0.jar.
Here is an example extracted from exp/generate.py.

```python
import subprocess

# Call kibo for a C++ templated feature
JAR = '../tools/kibo-1.2.0.jar'
KIBO = ['java', '-jar', JAR]

def generate(namespace: str, dsmb_path: str, template: str, output: str, *args):
    options = [
        '-c', 'cpp',
        '-n', namespace,
        '-d', dsmb_path,
        '-t', f'../templates/cpp/{template}',
        '-o', output
    ]
    cmd = KIBO + options + list(args)
    subprocess.run(cmd)

# Generate all features
def render_templates(namespace, dsmb_path, output):
    templates = [
        'Model',
        'Data',
        'Stream', 'Json',
        'Database',
        'Commit',
        'PoolCommit',
        'ValueType', 'ValueCodec',
        'Fuzz', 'Test'
    ]
    for template in templates:
        generate(namespace=namespace, dsmb_path=dsmb_path, template=template, output=output)
...
```
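The extract above calls subprocess.run without checking kibo's exit status; if you want a generation failure to stop your script, a hedged variant looks like this (run_kibo is a hypothetical name, not part of the provided scripts):

```python
import subprocess
import sys

def run_kibo(cmd: list[str]) -> None:
    # check=True raises subprocess.CalledProcessError on a non-zero exit
    # status, so a failing kibo invocation aborts the generate step.
    subprocess.run(cmd, check=True)

# Demonstration with the Python interpreter standing in for the java command
run_kibo([sys.executable, '-c', 'pass'])
```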
4. Generate Python features¶
For Python, we just need to create the package and generate the embedded resource
included by package/definitions.py.

```python
import base64
import subprocess
import zlib

def generate_package(name: str, dsmb_path: str, definitions: DefinitionsConst, output: str):
    options = [
        '-c', 'python',
        '-n', name,
        '-d', dsmb_path,
        '-t', '../templates/python/package',
        '-o', output
    ]
    cmd = KIBO + options
    subprocess.run(cmd)
    # Generate the embedded definition
    blob = definitions.encode()
    string = base64.b64encode(zlib.compress(blob.encoded()))
    with open(f'{output}/resources.py', 'w') as file:
        file.write(f"B64_DEFINITIONS = {string}")
```
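On the consuming side, package/definitions.py presumably reverses this encoding (base64 decode, then zlib decompress); the round trip can be sketched with plain stdlib calls, using a stand-in payload for the real encoded blob.

```python
import base64
import zlib

payload = b'example definitions blob'  # stand-in for definitions.encode().encoded()
encoded = base64.b64encode(zlib.compress(payload))

# What the generated package presumably does to recover the blob:
decoded = zlib.decompress(base64.b64decode(encoded))
print(decoded == payload)  # → True
```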
What next...¶
For your project, you can copy exp/generate.py and keep the features you need,
or adapt the steps to your build system.
The steps for C++:
1) Save the DSM Definitions to a binary representation (.dsmb).
2) Generate the resource for the Definitions included by the generated Model/Definitions.cpp.
3) Call kibo-1.2.0.jar for each feature.
For Python, you can use `dsm_util.py create_python_package ...`
(See Getting started with dsm_util.py)