Add Extra Local Data Services
You can add extra local data storage services to the Local Simple Runtime. The containerized test environment is ideal for testing and can be deleted at any time. These services can also be accessed directly via 127.0.0.1 instead of host.docker.internal.
API Route, MQ and Scheduler
Triggers - API route, MQ and scheduler - CLI commands.
Context and Task
Data Context is an object injected into logic functions during a task; it contains several key components.
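As a rough sketch of the idea (the property names `payload`, `task`, and `taskId` below are illustrative assumptions, not the documented LOC API), a logic function receives the context object and reads its components like this:

```javascript
// Hypothetical sketch of a logic function receiving the data context.
// Property names (payload, task, taskId) are illustrative, not a
// verbatim LOC API reference.
async function run(ctx) {
  const payload = await ctx.payload(); // data sent by the trigger
  const taskId = ctx.task.taskId;      // metadata about the current task
  return { payload, taskId };
}

// A minimal mock context to demonstrate the shape locally:
const mockCtx = {
  payload: async () => ({ hello: "world" }),
  task: { taskId: "task-123" },
};

run(mockCtx).then((result) => console.log(result.taskId)); // prints "task-123"
```

The mock context exists only so the sketch runs outside LOC; in a real task the runtime injects the context for you.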
Create and Deploy Data Process
CLI allows you to create data process projects locally and deploy them with console commands.
Database Agent
Connect to external databases with built-in drivers. The following databases are currently supported:
Deploy API Route
API routes are one of the trigger features that can be deployed and managed from CLI, making data processes accessible in a RESTful-like manner. See Triggers commands for more details.
Deploy MQ Client, Pub and Sub
Message queue clients, publishers and subscribers together are one of the trigger features that can be deployed and managed from CLI.
Deploy Scheduler
Scheduler - schedules and scheduled jobs - together are one of the trigger features that can be deployed and managed from CLI. See Triggers commands for more details.
Emit and Inspect Events
LOC data events allow users to generate a data flow or data trail indicating who has sent and received data. This information can be very useful for two reasons:
Event Store Agent
Emit and query LOC data events.
Events and Multiple Data Processes
This is a follow-up to Emit and Inspect Events, in which we learned the basics of emitting events.
File Storage Agent
Read and write remote files. The following protocols/storage types are supported:
General, Profile and Login
General, profile and login CLI commands.
HTTP Agent
For sending HTTP requests.
Introduction
LOC v0.6 Release Note (legacy version)
Local Storage Agent
For sharing data between tasks of one data process.
Logging Agent
Log (debugging) messages to LOC.
Logic and Session
Logic are the core elements of a data process; each contains the implementation of part of a data pipeline. Session storage is what links them together.
Mail Agent
Sending emails with an SMTP server.
Payload
Payload is the data sent by triggers as input and is accessible from the context object. Currently there are two payload types:
Project, Data Process and Logic
Project, data process and logic CLI commands.
Quick Start
This quick start will go through a tutorial of building a simple "Hello World" data process in LOC Studio using JavaScript.
Release Note
LOC v0.6 has brought some new features and improvements since v0.5.0. This is also the first available version of LOC Documentation.
Result Agent
Finalise a JSON object as the result data. If the trigger is a synchronous API route or message queue, the result will be returned to the trigger.
Session Storage Agent
For sharing data between logic during a task (execution) of a data process.
Setup CLI
LOC CLI is the command line tool for developers to deploy data processes and manage other assets in LOC.
Single Data Process Execution
Once a data process is deployed, it can be executed directly without triggers. This works exactly like the Execution function in Studio.
Studio Overview
An overview of Studio functionalities.
Tag and Unit
Tag and unit CLI commands.
Tips on Error Handling
Some tips about handling errors in logic.
Using Local Simple Runtime
Local Simple Runtime is a Docker-based local LOC environment. It offers the following benefits: