SERAPHIM is a web server written in Pick BASIC for multivalue database environments.
It is fully self-contained: a single file that can be downloaded and compiled for UniVerse, ScarletDME, and D3. It can likely also run on UniData and OpenQM, though that hasn't been tested.
Installing SERAPHIM is as simple as downloading the file and compiling it.
wget https://raw.githubusercontent.com/Krowemoh/TCL-Utilities/main/SERAPHIM
Compile and catalog the routine:
BASIC BP SERAPHIM
CATALOG BP SERAPHIM
You can also use my package manager NPM to install it.
NPM INSTALL BP SERAPHIM
SERAPHIM expects the CONTROL-FILE file to exist. If it doesn't, you can create it with the following:
CREATE-FILE CONTROL-FILE 50,2,6 100,4,18
SERAPHIM expects two records to exist in the CONTROL-FILE: a SERAPHIM record and a ROUTES record.
The SERAPHIM record in the CONTROL-FILE defines the port to run the web server on and the path to any static files that need to be served.
CONTROL-FILE SERAPHIM
1. 7122
2. /path/to/static/assets/
The ROUTES record in the CONTROL-FILE contains the paths and their handlers.
1. / : @VM : /devlog/ : @VM : /devlog/<page>
2. WEB.INDEX : @VM : WEB.BLOG.INDEX : @VM : WEB.BLOG.PAGE
This record maps each of the above paths to the subroutine that handles it. SERAPHIM also supports URL slugs, so you can use angle brackets to capture dynamic segments of the path.
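If you prefer to create the two records programmatically rather than in the editor, a short BASIC program along these lines should work; this is a sketch that simply writes the example port, static path, routes, and routine names shown above.
OPEN 'CONTROL-FILE' TO F.CONTROL ELSE STOP
*
* SERAPHIM RECORD: PORT AND STATIC ASSET PATH
SETTINGS = ''
SETTINGS<1> = 7122
SETTINGS<2> = '/path/to/static/assets/'
WRITE SETTINGS ON F.CONTROL, 'SERAPHIM'
*
* ROUTES RECORD: PATHS ON ATTRIBUTE 1, HANDLERS ON ATTRIBUTE 2
ROUTES = ''
ROUTES<1> = '/' : @VM : '/devlog/' : @VM : '/devlog/<page>'
ROUTES<2> = 'WEB.INDEX' : @VM : 'WEB.BLOG.INDEX' : @VM : 'WEB.BLOG.PAGE'
WRITE ROUTES ON F.CONTROL, 'ROUTES'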
SERAPHIM can be run in the foreground or it can be run as a background process.
The foreground:
SERAPHIM --NO-PHANTOM
The background:
SERAPHIM
If everything goes well, you should see lines like the following:
Listening on: 0.0.0.0:7122
Started in non-PHANTOM mode.
SERAPHIM runs route-handling subroutines with the expectation that they take one request parameter and return one response parameter.
SUBROUTINE WEB.INDEX(MAT REQUEST,MAT RESPONSE)
The REQUEST parameter that is passed in contains all the information that SERAPHIM has parsed out, as well as the raw request that came in.
DIM REQUEST(11)
*
EQU REQUEST.TYPE.ATTRIBUTE TO 1
EQU REQUEST.VERSION.ATTRIBUTE TO 2
EQU REQUEST.URL.ATTRIBUTE TO 3
EQU REQUEST.SLUGS.ATTRIBUTE TO 4
EQU REQUEST.QUERY.ATTRIBUTE TO 5
EQU REQUEST.HEADERS.ATTRIBUTE TO 6
EQU REQUEST.COOKIES.ATTRIBUTE TO 7
EQU REQUEST.RAW.BODY.ATTRIBUTE TO 8
EQU REQUEST.FORM.ATTRIBUTE TO 9
EQU REQUEST.JSON.ATTRIBUTE TO 10
EQU REQUEST.RAW.REQUEST.ATTRIBUTE TO 11
*
MAT REQUEST = ''
*
These are not to be updated or modified.
The RESPONSE parameter will need to be filled in by the subroutine.
DIM RESPONSE(3)
*
EQU RESPONSE.STATUS.ATTRIBUTE TO 1
EQU RESPONSE.HEADERS.ATTRIBUTE TO 2
EQU RESPONSE.CONTENT.ATTRIBUTE TO 3
*
STATUS is the only required attribute. CONTENT will be commonly used, and you also have the ability to add headers if need be; this is useful for things like managing logins or setting cookie information.
A very simple routine would be the following:
SUBROUTINE WEB.LOGIN(MAT REQUEST,MAT RESPONSE)
*
EQU TRUE TO 1
EQU FALSE TO 0
*
DIM REQUEST(11)
*
EQU REQUEST.TYPE.ATTRIBUTE TO 1
EQU REQUEST.VERSION.ATTRIBUTE TO 2
EQU REQUEST.URL.ATTRIBUTE TO 3
EQU REQUEST.SLUGS.ATTRIBUTE TO 4
EQU REQUEST.QUERY.ATTRIBUTE TO 5
EQU REQUEST.HEADERS.ATTRIBUTE TO 6
EQU REQUEST.COOKIES.ATTRIBUTE TO 7
EQU REQUEST.RAW.BODY.ATTRIBUTE TO 8
EQU REQUEST.FORM.ATTRIBUTE TO 9
EQU REQUEST.JSON.ATTRIBUTE TO 10
EQU REQUEST.RAW.REQUEST.ATTRIBUTE TO 11
*
DIM RESPONSE(3)
*
EQU RESPONSE.STATUS.ATTRIBUTE TO 1
EQU RESPONSE.HEADERS.ATTRIBUTE TO 2
EQU RESPONSE.CONTENT.ATTRIBUTE TO 3
*
BEGIN CASE
CASE REQUEST(REQUEST.TYPE.ATTRIBUTE) = 'GET'
GOSUB HANDLE.GET
*
CASE TRUE
RESPONSE(RESPONSE.STATUS.ATTRIBUTE) = 405
RESPONSE(RESPONSE.CONTENT.ATTRIBUTE) = "Invalid type"
END CASE
*
RETURN
*
********************* S U B R O U T I N E *********************
*
HANDLE.GET:NULL
*
RESPONSE(RESPONSE.STATUS.ATTRIBUTE) = 200
RESPONSE(RESPONSE.CONTENT.ATTRIBUTE) = 'Hello, World!'
*
RETURN
*
* END OF PROGRAM
*
END
*
The subroutine can handle multiple request types, and you can call further subroutines inside the request handler if need be.
The shape of all request handling routines will be similar to the above.
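For example, the dispatch block can branch on additional request types, and a handler can CALL out to other cataloged subroutines. The fragment below is a sketch; SAVE.USER is a hypothetical routine, not part of SERAPHIM.
BEGIN CASE
CASE REQUEST(REQUEST.TYPE.ATTRIBUTE) = 'GET'
GOSUB HANDLE.GET
*
CASE REQUEST(REQUEST.TYPE.ATTRIBUTE) = 'POST'
GOSUB HANDLE.POST
*
CASE TRUE
RESPONSE(RESPONSE.STATUS.ATTRIBUTE) = 405
RESPONSE(RESPONSE.CONTENT.ATTRIBUTE) = "Invalid type"
END CASE
*
RETURN
*
HANDLE.POST:NULL
*
* A HANDLER CAN CALL OTHER CATALOGED SUBROUTINES
* SAVE.USER IS A HYPOTHETICAL EXAMPLE
CALL SAVE.USER(REQUEST(REQUEST.FORM.ATTRIBUTE), SAVED)
*
RESPONSE(RESPONSE.STATUS.ATTRIBUTE) = 200
RESPONSE(RESPONSE.CONTENT.ATTRIBUTE) = 'Saved'
*
RETURN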
This is an example of how to get data from a form:
HANDLE.POST:NULL
*
FORM.DATA = REQUEST(REQUEST.FORM.ATTRIBUTE)
*
LOCATE("username",FORM.DATA,1;ANYPOS) THEN
USERNAME = FORM.DATA<2,ANYPOS>
END ELSE USERNAME = ""
*
LOCATE("password",FORM.DATA,1;ANYPOS) THEN
PASSWORD = FORM.DATA<2,ANYPOS>
END ELSE PASSWORD = ""
*
* ASSUME SESSION.ID WAS GENERATED HERE
*
RESPONSE(RESPONSE.STATUS.ATTRIBUTE) = 302
RESPONSE(RESPONSE.HEADERS.ATTRIBUTE)<-1> = "Location: /"
RESPONSE(RESPONSE.HEADERS.ATTRIBUTE)<-1> = "Set-Cookie: session_id=" : SESSION.ID
RESPONSE(RESPONSE.CONTENT.ATTRIBUTE) = ""
*
RETURN
*
The data from a form is available as two multivalued lists. The controlling list contains the names of the fields while the associated list contains the values.
The controlling list is the first attribute of the FORM and the associated list is the second attribute of the FORM.
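As a concrete illustration, a form posting username=alice and password=secret (hypothetical values) would arrive shaped roughly like this:
FORM.DATA<1> = 'username' : @VM : 'password'
FORM.DATA<2> = 'alice' : @VM : 'secret'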
Query parameters are very similar to form data: there is a controlling list and an associated list. The only difference is that the data is found in the QUERY attribute of the request.
HANDLE.GET:NULL
*
LOCATE('query',REQUEST(REQUEST.QUERY.ATTRIBUTE)<1>,1;QUERY.POS) ELSE QUERY.POS = 1
QUERY = REQUEST(REQUEST.QUERY.ATTRIBUTE)<2,QUERY.POS>
*
* ASSUME RESULT WAS POPULATED WITH SEARCH RESULTS
*
RESPONSE(RESPONSE.STATUS.ATTRIBUTE) = 200
RESPONSE(RESPONSE.CONTENT.ATTRIBUTE) = RESULT
*
RETURN
*
URL slugs are handled in much the same way as form data: there is a controlling list and an associated list. This data is found in the SLUGS attribute.
HANDLE.GET:NULL
*
LOCATE('page',REQUEST(REQUEST.SLUGS.ATTRIBUTE)<1>,1;SLUG.POS) ELSE SLUG.POS = 1
PAGE.URL = REQUEST(REQUEST.SLUGS.ATTRIBUTE)<2,SLUG.POS>
*
* ASSUME THAT THE PAGE WAS RETRIEVED BASED ON PAGE.URL
*
RESPONSE(RESPONSE.STATUS.ATTRIBUTE) = 200
RESPONSE(RESPONSE.CONTENT.ATTRIBUTE) = RESULT
*
RETURN
You can also submit data with the Content-Type set to application/json; the body will be sent across in the JSON attribute with no processing done.
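A minimal sketch of a handler branch that reads the raw JSON body and echoes it back; since SERAPHIM does no processing, any parsing is left to the handler.
HANDLE.POST:NULL
*
RAW.JSON = REQUEST(REQUEST.JSON.ATTRIBUTE)
*
* THE BODY ARRIVES UNPROCESSED; PARSE IT HOWEVER YOU LIKE
*
RESPONSE(RESPONSE.STATUS.ATTRIBUTE) = 200
RESPONSE(RESPONSE.HEADERS.ATTRIBUTE)<-1> = "Content-Type: application/json"
RESPONSE(RESPONSE.CONTENT.ATTRIBUTE) = RAW.JSON
*
RETURN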
The REQUEST parameter will also have the full raw request in it, so any custom processing you need to do is possible.
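For instance, you could pull the raw request into a variable and inspect or log it yourself (an illustrative fragment):
RAW.REQUEST = REQUEST(REQUEST.RAW.REQUEST.ATTRIBUTE)
*
* E.G. WRITE IT SOMEWHERE FOR DEBUGGING OR PARSE A HEADER SERAPHIM DOESN'T EXPOSE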