Helpers

Since you may want to do things other than reading and writing Agillic data, we provide a number of helpers for common operations.

In addition to the documentation here, you can download a JavaScript file containing the full list of methods with documentation. You can also load this file into your script so that your editor can offer auto-completion while you write.

This has currently been tested on IntelliJ only.

To do that, simply add this line at the top of your script:

load("file://<relevant_path_on_your_machine>/autocomplete.js");

Do not rename the file to anything other than autocomplete.js. If you do, the load will still work, but the result is that all helper functions are overwritten with the blank versions from the file. We have logic on our end that automatically removes any load of a file called autocomplete before we execute the script, but that removal is tied to that particular name.

You can find more relevant examples of the helpers described below in the examples section.

The names listed below are the names they are exposed as – i.e. for the logger helper below, you can simply do this in your script:

logger.info("That was easy!");

logger

Support for logging messages – a standard logging setup with info, warn, and error methods that each take a message.

Agillic uses the same logger when logging information at runtime (failures to parse data for a given type, errors when executing the script, etc.), so you will see our information alongside your own log statements.

Logged statements are written to the WebDAV Log folder, in the extension.log file. We roll and zip the file once per hour and retain it for two weeks.

Some example usages:

logger.info("Start processing");
logger.info("Done");
logger.error("Failure");

The resulting log lines include your message together with a context.

The context contains:

  • extension: the id of the extension
  • recipient: the AGILLIC_ID of the recipient we run for
  • flow: the name of the flow
  • support: if the log message comes from us, it will usually have the support context so you have an idea of which part of our code produced the message

Do not make any assumptions regarding the log format; we may change it at a later stage. Do not assume the ordering of the fields in the context is fixed – it may very well change from log line to log line and is currently out of our control.

http

Support for making http calls, using the fetch format:

var response = fetch(url,request) – calls the given URL using the given request and returns an object containing the numeric statusCode and the string body of the response.

url is the full path to the endpoint you want to call, in string format.

request is a JSON object whose attributes describe the more detailed aspects of the API call. Each attribute is described below.

  • method (String) – The http method to invoke
  • headers (JSON) – Headers to be present in the call
  • body (Unrestricted) – Body sent with the call
  • timeout (Number) – Milliseconds until the call will time out. Maximum 3000.
  • responseCharset (String) – JSON responses should by default be UTF-8, but this is not required. If no charset is present in the response, the extension defaults to Java's default charset, iso-8859-1.
  • authProviderId (String) – Reference to a static or timeout-based Authentication header, inserted at runtime so that your code doesn't contain sensitive auth tokens. Contact Agillic Support to set up these tokens.

An example
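Here is a minimal sketch of a call built from the attributes above. The URL, headers, and body are placeholders – replace them with your own endpoint and payload:

// Placeholder endpoint and payload – adjust to the API you are calling.
var request = {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ email: "jane@example.com" }),
    timeout: 3000
};
var response = fetch("https://api.example.com/contacts", request);
if (response.statusCode === 200) {
    logger.info("Contact created: " + response.body);
} else {
    logger.error("Call failed with status " + response.statusCode);
}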

The standalone jar can run this http helper against the designated endpoint, or run it in a mock testing mode. As there is a lot that can be done with this http helper, we recommend looking at more examples.

cache

The cache helper allows you to do some basic caching inside JS extensions. Each type of cache we offer addresses distinct use cases.

The three types of caches are:

  • Standard
  • Execution
  • Loading

Caches do not persist during deployments or restarts of your Agillic Instance. This should be taken into account during your implementation.

Standard & Execution Cache

The Standard Cache is the most basic cache offered. It is intended for use cases that require many entry keys and where write operations are generally simple (such as incrementing a counter), or for simple cases where keys and their corresponding values are identical and the cache is used to determine whether an entry already exists.

The Execution Cache is a variant of the Standard Cache in which the cache only persists within the execution context of a flow’s target group evaluation. As such, this cache type should only be used in Condition Extensions on target groups that are referenced only for flow executions. Because of the way logic in conditions using this cache type typically works, the order in which conditions are evaluated can change which recipients are filtered. In target groups where such conditions are present, conditions are therefore evaluated from top to bottom, and you should place your extension condition deliberately to achieve the intended behavior, and test to confirm it. Incorrect placement can result in too few, or too many, recipients being filtered out. Behavior for this cache type in conditions applied on Promotions, Multi-Content blocks, Flow Steps, and other places is undefined.

The Standard and Execution Caches have identical input arguments, but are invoked with different functions:

  • The name argument, which is the cache’s name
  • The setting argument, which is a JSON object containing the maxSize attribute. The accepted range for maxSize is 1 to 2,000,000.

If you want to access the same cache in several separate extensions, you must implement the initialization lines in each, using the same name and settings.

Standard and Execution Cache types provide several useful functions:

  • _empty() – Empties the cache. Note that the name starts with an underscore. Returns null. Example: example_standard_cache._empty();
  • putIfAbsent(key, value) – Writes value to key if it is not already present in the cache. Returns null if the write succeeds, otherwise the existing value. Example: example_standard_cache.putIfAbsent("foo", "bar");
  • merge(key, default, function(current)) – If the entry is empty, stores default as the value; otherwise stores the result of the function, which takes the current value of the key as its input argument. Works identically to the merge function for writing to Global Data. Returns the value of the cache entry after the operation is complete. Example: example_standard_cache.merge("count", 1, function(current_value){ return current_value + 1; });

Example
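A minimal sketch, assuming a Standard Cache has already been initialized as example_standard_cache using the name and settings arguments described above:

// A per-key counter: merge stores 1 on the first call and increments on later calls.
var key = "campaign_sends"; // placeholder key name
var count = example_standard_cache.merge(key, 1, function(current_value){ return current_value + 1; });
logger.info("Seen " + count + " times");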

Find a more complete example in the examples section.

Loading cache

The Loading Cache is based on Google’s Guava library. This cache self-populates with the return value of the provided function and is intended to be written to infrequently, making it suitable for storing values that take some time to generate (such as an OAuth token request) and where an expiration of the cached value is appropriate.

The Loading Cache has the following input arguments:

  • name: describing the name of the cache.
  • settings: a JSON object where you can apply the following attributes to the cache:
      • maxSize – Defined maximum size of the cache, with an accepted range from 1 to 2,000.
      • duration – Number value, used together with timeUnit to define the timeout limit. Defaults to 30.
      • timeUnit – TimeUnit enum, used together with duration to describe the timeout limit. Defaults to TimeUnit.MINUTE. If the combination of timeUnit and duration exceeds 12 hours, it will be lowered to 12 hours. See the info on TimeUnit in the dates helper.
      • expireOnWrite – Boolean. If true (the default), the expiration timeout is set when the cache is written to. If false, reading the cache entry resets the timeout.
  • function(key_name) which contains the code for how the cache should populate itself. The input argument is the key name.

You can also apply the _empty() function to this cache type, but as it is self-populating, this is probably not necessary.
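As a sketch of how the arguments fit together – the settings values and the token endpoint below are placeholders, and the actual initialization call is shown in the examples section:

// Placeholder settings for a small, short-lived cache.
var settings = {
    maxSize: 10,
    duration: 30,
    timeUnit: TimeUnit.MINUTE,
    expireOnWrite: true
};
// Loader function: called when a key is missing or has expired.
var loader = function(key_name) {
    // Hypothetical OAuth token endpoint – replace with your own.
    var response = fetch("https://auth.example.com/oauth/token", {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body: "grant_type=client_credentials",
        timeout: 3000
    });
    return JSON.parse(response.body).access_token;
};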

Find a more complete example in the examples section.

dates

Support for working with dates. The following methods are available – input dates are converted the same way as for Person Data, and we always return numbers (epoch millis):

  • add(date,amount,unit) – add the given amount of the given type (seconds, minutes, etc.) to the given date
  • distance(start,end,unit) – return the distance as an integer between the two dates in the given unit – the result is start – end, so the result is a positive number if start is after end, and negative otherwise
  • parse(string,format) – given a string and a format (Java time format), parse the string and return the date
  • format(time,format) – given a date and a format (Java time format), format the time using the format and return the string

The unit inputs are from TimeUnit:

TimeUnit.SECOND
TimeUnit.MINUTE
TimeUnit.HOUR
TimeUnit.DAY
TimeUnit.MONTH
TimeUnit.YEAR
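
A small sketch combining the methods – the date strings and format patterns are just illustrative Java time patterns:

var firstContact = dates.parse("2024-01-01", "yyyy-MM-dd");               // epoch millis
var reminder = dates.add(firstContact, 3, TimeUnit.DAY);                   // three days after firstContact
var daysBetween = dates.distance(reminder, firstContact, TimeUnit.DAY);    // 3, since reminder is after firstContact
logger.info("Reminder date: " + dates.format(reminder, "yyyy-MM-dd"));     // "2024-01-04"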

encryption

Hashing and encryption support – currently only hashing:

We currently support the following hash algorithms: MD5, SHA1, SHA256, and the encodings: HEX, BASE64, URLSAFE_BASE64

  • hash(value,hash,encoding) – hash and then encode the given value.
  • hash(value,hash) – hash and encode the given value using our default encoding (URLSAFE_BASE64)
  • hash(value) – hash and encode the given value using our default hash and encoding (SHA256 + URLSAFE_BASE64)
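
A sketch of the three overloads – this assumes the algorithm and encoding names are passed as plain strings, and the value is a placeholder:

var value = "jane@example.com"; // placeholder value
var hexSha256 = encryption.hash(value, "SHA256", "HEX"); // explicit hash and encoding (string names assumed)
var md5Default = encryption.hash(value, "MD5");           // default encoding (URLSAFE_BASE64)
var allDefaults = encryption.hash(value);                 // default hash and encoding (SHA256 + URLSAFE_BASE64)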

environment

A few simple environment methods:

  • getEnvironment() – return either staging or production
  • getSolution() – return the full name of the solution, e.g. name-prod or name-stag
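
For example, you can use this to guard logic that should only run in production (a minimal sketch, assuming the returned values are the strings shown above):

if (environment.getEnvironment() === "production") {
    // Only run production-specific logic here.
    logger.info("Running on " + environment.getSolution());
}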

recipientService

This helper is only accessible in Http Extensions. It gives this extension type an equivalent of the recipient object, so you can read from and write to Recipients.

  • recipientService.query(queryString): This function queries your Agillic recipient database, for recipients matching the query. It returns an array of recipient objects, which can be read from and written to, as described here, with all data (target groups included) preloaded for the found recipients. For best performance, we recommend only doing queries on indexed person data fields. The queryString follows our API Collection Filtering format.
  • recipientService.create(): This function returns a new blank recipient. It takes an optional argument – a JSON object of Person Data – used to populate the new recipient.
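
A minimal sketch – the person data field names and the query string are placeholders, and the query syntax must follow the API Collection Filtering format:

// Look up recipients by an (ideally indexed) person data field.
var matches = recipientService.query("EMAIL=jane@example.com"); // placeholder query
if (matches.length === 0) {
    // No match – create a new recipient pre-populated with person data.
    var recipient = recipientService.create({ "EMAIL": "jane@example.com", "FIRSTNAME": "Jane" });
    logger.info("Created new recipient");
} else {
    logger.info("Found " + matches.length + " matching recipient(s)");
}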

random

Methods for generating random strings:

  • randomAlphabetic(length) – return a random string of the requested length using only alphabetic characters.
  • randomAlphanumeric(length) – return a random string of the requested length using alphabetic characters and numbers.
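
For example (the length and log statement are just illustrative):

var code = random.randomAlphanumeric(12); // e.g. a 12-character voucher code
logger.info("Generated code: " + code);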