Extend and supercharge your DynamoDB DocumentClient with promises, retries, and more.
All the methods from the official DocumentClient API Documentation are supported (minus the callbacks, of course).
npm i dynamo-plus

const { DynamoPlus } = require('dynamo-plus')
const documentClient = DynamoPlus({
region: 'eu-west-1',
})
const regularDynamoParams = {
TableName: 'myTable',
Key: {
myKey: '1337'
}
}
const data = await documentClient.get(regularDynamoParams)

- automatically appends .promise()
- automatically retries and backs off when you get throttled
- new methods for performing batchWrite requests in chunks
- new methods for query operations
- new methods for scan operations
The DynamoPlus client will automatically append .promise() for you, making all methods awaitable by default.
When the client is instantiated, the original methods are prefixed and remain accessible as e.g. original_${method}.
Whenever a request fails with a retryable exception such as LimitExceededException, the promise will retry with a backoff behind the scenes so that you don't have to worry about it.
For information about retryable exceptions, see https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Programming.Errors.html#Programming.Errors.MessagesAndCodes
If you want to use a delay from the beginning, set lastBackOff to a millisecond value in the query params.
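For example, you can pass lastBackOff alongside the regular parameters (a minimal sketch; the 1000 ms starting value is arbitrary):

const params = {
  TableName: 'myTable',
  Key: { myKey: '1337' },
  lastBackOff: 1000 // start with a one-second backoff instead of retrying immediately
}
const data = await documentClient.get(params)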
batchWrite is neat for inserting multiple documents at once, but it requires you to handle chunking and unprocessed items yourself, while also using its own somewhat unique syntax. We've added deleteAll() and putAll() to do the heavy lifting for you.
deleteAll() performs batchWrite deletions, but with the simple syntax of delete.
- params - Object
  - TableName
  - Keys - An array of key objects, equivalent to Key in delete().
  - BatchSize - Optional custom batch size. Defaults to 25, which is the maximum value permitted by DynamoDB.
deleteAll() does not return any data once it resolves.
const params = {
TableName: 'Woop woop!',
Keys: [{ userId: '123' }, { userId: 'abc' }]
}
await documentClient.deleteAll(params)

putAll() performs batchWrite upserts, but with the simple syntax of put.
- params - Object
  - TableName
  - Items - An array of documents, equivalent to Item in put().
  - BatchSize - Optional custom batch size. Defaults to 25, which is the maximum value permitted by DynamoDB.
putAll() does not return any data once it resolves.
const params = {
TableName: 'Woop woop!',
Items: [{ a: 'b' }, { c: 'd' }]
}
await documentClient.putAll(params)

Query has new sibling methods that automatically paginate through result sets for you.
queryAll() resolves with the entire array of matching items.
const params = {
TableName : 'items',
IndexName: 'articleNo-index',
KeyConditionExpression: 'articleNo = :val',
ExpressionAttributeValues: { ':val': articleNo }
}
const response = await documentClient.queryAll(params)
// response now contains ALL items with the articleNo, not just the first 1MB

Like scanStream(), but for queries.
Like scanStreamSync, but for queries.
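A minimal sketch of streaming query results, assuming the siblings are named queryStream() and queryStreamSync() to mirror their scan counterparts (the names are inferred, not stated above):

const params = {
  TableName: 'items',
  KeyConditionExpression: 'articleNo = :val',
  ExpressionAttributeValues: { ':val': '1337' }
}
const emitter = documentClient.queryStream(params) // inferred name, mirroring scanStream()
emitter.on('items', (items) => {
  console.log(items)
})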
We've supercharged scan() for those times when you want to recurse through entire tables.
scanAll() resolves with the entire array of matching items.
const params = {
TableName : 'MyTable',
FilterExpression : '#yr = :this_year',
ExpressionAttributeNames : { '#yr': 'Year' }, // Year is a DynamoDB reserved word, so it needs an alias
ExpressionAttributeValues : { ':this_year': 2015 }
}
const response = await documentClient.scanAll(params)
// response now contains ALL documents from 2015, not just the first 1MB

An EventEmitter-driven approach to recursing your tables. This is a powerful tool when you have datasets that are too large to keep in memory all at once.
To spread out the workload across your table partitions you can define a number of parallelScans. DynamoPlus will automatically keep track of the queries and emit a single done event once they all complete.
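As a sketch, assuming parallelScans is passed after the scan parameters (as the parameter list below suggests), a four-segment scan could look like this:

// Split the scan into four parallel segments; the two-argument call is assumed from the parameter list below
const emitter = documentClient.scanStream({ TableName: 'MyTable' }, 4)
emitter.on('done', () => {
  console.log('All four segments have completed')
})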
Note: scanStream() does not care whether your event listeners finish before it requests the next batch. (It will, however, respect throttling exceptions from DynamoDB.) If you want to control the pace, see scanStreamSync.
- params - AWS.DynamoDB.DocumentClient.scan() parameters
- parallelScans - integer|array (Default: 1) Number of segments to split the scan operation into. It also accepts an array of individual segment options such as LastEvaluatedKey, in which case the array length determines the number of segments.
The returned EventEmitter emits the following events:
- data - Raw response from each scan
- items - An array with documents
- done - Emitted once there are no more documents to scan
- error
const params = {
TableName : 'MyTable'
}
const emitter = documentClient.scanStream(params)
emitter.on('items', async (items) => {
console.log(items)
})

Like scanStream(), but will not proceed to request the next batch until all event listeners have returned a value (or resolved, if they return a Promise).
- params - AWS.DynamoDB.DocumentClient.scan() parameters
- parallelScans - integer|array (Default: 1) Number of segments to split the scan operation into. It also accepts an array of individual segment options such as LastEvaluatedKey, in which case the array length determines the number of segments.
The returned EventEmitter emits the following events:
- data - Raw response from each scan
- items - An array with documents
- done - Emitted once there are no more documents to scan
- error
const params = {
TableName : 'MyTable'
}
const emitter = documentClient.scanStreamSync(params)
emitter.on('items', async (items) => {
// Do something async with the documents
return Promise.all(items.map((item) => sendItemToSantaClaus(item)))
// Once the Promise.all resolves, scanStreamSync() will automatically request the next batch.
})

aws-sdk is set as a dev-dependency since it is pretty large and installed by default on AWS Lambda.
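If you run dynamo-plus somewhere other than Lambda, you will likely need to install aws-sdk yourself:

npm i aws-sdk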
The original, untouched DocumentClient methods are all available with an original_ prefix:
const { DynamoPlus } = require('dynamo-plus')
const documentClient = DynamoPlus()
documentClient.original_get(myParams, (err, data) => {})
// or
documentClient.original_get(myParams).promise()

Automatic retries don't apply when calling original methods directly.
That's a statement, but I see your point.