Learning about coding secure-scuttlebutt apps

Although I posted it first elsewhere, this post was inspired by the principles of open learning, so I am reposting it here :slight_smile:


I want to write a little thing about my very first real SSB hack day the other day.

Goal: to write a minimal SSB thingy that could do two things:

  1. send private messages
  2. receive new private messages

There is no requirement to fetch past sessions or conversations.

Here’s something like how things went…

I didn’t know whether to use ssb-client or ssb-server. At first I tried running ssb-client, and it looked promising, until something made me realize that it was connecting to my already-running Patchwork ssb-server instance.


So then I realized I definitely needed ssb-server, because I want this to be standalone: something I can spin up easily on a completely new system, independent of other SSB apps.

I think it was in the readme of ssb-server that I hit some confusion… it has two chunks of code, one for ssb-server, and then says “elsewhere:” and shows another code snippet with ‘ssb-client’. This made me unsure whether I NEEDED ssb-client, or whether it was just an option.
See the bit about ‘example usage’ on ssb-server README.

Using ssb-server, there were a lot of references to plugins. This was a sore spot.
I couldn’t find a good reference for which plugins exist or what they do. If someone has such a reference, I would still love to see it. I could see examples of ssb plugins in code samples, but somehow that didn’t feel helpful; the choices seemed a bit arbitrary.
There is this resource that references plugins, but it went deeper than I needed on one particular plugin, so I mostly skipped over it.
https://josiahwitt.com/2018/07/08/scuttlebutt-intro-flume.html

Wouldn’t this page be a good one to put plugins into?
https://www.scuttlebutt.nz/modules

It was explained to me that by setting caps.shs and caps.sign I could work on a test network, so my experiments wouldn’t reach the main SSB network. Fairly easy to grasp and get started with.
One thing I’m still unsure of is port in the config. Port for what?

Overall, this ‘field guide’ does a nice job of what I think it intends: a quick and dirty orientation.

I created an empty ~/.ssb-test folder, created a config file inside it, and dropped in:

{
  "caps": {
    "shs": "g4rJuU2yHGOcwc8OxrZDaGpmhHAt9r3fenaTIaRfSJo=",
    "sign": "JRjK44nw1miRqP65nHCV3qfEFjdpWVoBv7jyJYjN/UQ="
  },
  "port": 8007,
  "ws": {
    "port": 8988
  },
  "ssb_appname": "ssb-test"
}

I generated the shs (secret handshake) and sign values the recommended way: in a node REPL (just type node), run crypto.randomBytes(32).toString('base64') twice, once for each of the two new values.

I am STILL not sure whether I actually needed to perform this step: export ssb_appname="ssb-test" in the terminal.

Before going for private messages, I went for unencrypted ones. For this, all I needed was some sample code from the ssb-server readme, now that I had worked out that the same API available on ssb-client is also available directly on the ssb-server instance itself.

The server code could be pared down to:

const Server = require('ssb-server')
const Config = require('ssb-config/inject')
// ssb-test matches the name of ~/.ssb-test folder where the rest of the `config` is
const ssbconfig = Config('ssb-test')

Server
    .use(require('ssb-gossip'))
    .use(require('ssb-replicate'))
    .use(require('ssb-private'))

const server = Server(ssbconfig)

Once you’ve got the server, you can start to play. First, write a message, a-la ssb-server readme.

// publish a message
server.publish({ type: 'post', text: 'My First Post!' }, function (err, msg) {
  // msg.key           == hash(msg.value)
  // msg.value.author  == your id
  // msg.value.content == { type: 'post', text: 'My First Post!' }
  // ...
})

This worked fine.

Next: can I read a message? This gets me into the territory that made working with SSB daunting: pull-streams.

I see this code in the readme, and I don’t know why it’s done this way or what it means. I have a slight idea from prior reading.

const pull = require('pull-stream')
// ...

// stream all messages in all feeds, ordered by publish time
pull(
  server.createFeedStream(),
  pull.collect(function (err, msgs) {
    // msgs[0].key == hash(msgs[0].value)
    // msgs[0].value...
  })
)

At this point, it is problematic for me not to know what ‘pull.collect’ is or does, because what I need is a running (ongoing) stream, and I don’t know whether this gives me that.

As it turns out, by testing with it, I realized that it doesn’t. So I needed to review more closely what pull is, and what pull.collect is. For this, it was time to find the pull-streams docs.

Somehow or other (I can’t remember how now), this ended up being the pull-streams resource I found the fastest, which seems possibly problematic, since isn’t this the outdated documentation?
https://scuttlebot.io/apis/pull-stream/pull-stream.html
Long story short, the source, through, and sink descriptions on their pages helped… a lot of the intro and preamble is too philosophical.
“You must have a source at the start of a pipeline for data to move through.”
“A Through is a stream that both reads and is read by another stream. Through streams are optional. Put through streams in-between sources and sinks”
“You must have a sink at the end of a pipeline for data to move towards. You can only use one sink per pipeline”

Now I see this is a more comprehensive source of docs for it, though still intimidating: https://pull-stream.github.io/

This line kinda explains the pull function… “pull(a, b, c) is basically the same as a.pipe(b).pipe(c)”
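To get the model into my head, it helped to sketch the pull-stream protocol in plain Node. This is a toy illustration under my own simplifying assumptions, not the real pull-stream library: a source is a function (abort, cb) that calls cb(end, data) on each pull, a through wraps a source, and a sink takes a source and drives the pulling.

```javascript
// Source: emits the values of an array, then signals the end with cb(true)
function values(arr) {
  let i = 0
  return function source(abort, cb) {
    if (abort) return cb(abort)
    if (i >= arr.length) return cb(true)  // end of stream
    cb(null, arr[i++])                    // no error, next value
  }
}

// Through: wraps a source, transforming each value as it passes
function map(fn) {
  return source => (abort, cb) =>
    source(abort, (end, data) => end ? cb(end) : cb(null, fn(data)))
}

// Sink: pulls until the source ends, collecting values into an array
function collect(done) {
  return function sink(source) {
    const out = []
    source(null, function next(end, data) {
      if (end) return done(end === true ? null : end, out)
      out.push(data)
      source(null, next)
    })
  }
}

// Wire them together by nesting: pull(a, b, c) would effectively do c(b(a))
collect((err, msgs) => console.log(msgs))(map(x => x * 2)(values([1, 2, 3])))
// logs [ 2, 4, 6 ]
```

So the pull() function is just plumbing that saves you from writing the c(b(a)) nesting by hand.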

So at this point, it’s obvious that server.createFeedStream() is creating a source.
And pull.collect is creating a sink. What kind of sink?
This page states: https://scuttlebot.io/apis/pull-stream/core-sinks.html
pull.collect(cb): Read the stream into an array, then callback

I discovered what I wanted on that page, even though I still kinda don’t get the description…
drain (op?, done?)
“Drain the stream, calling op on each data. call done when stream is finished. If op returns ===false, abort the stream”

A simple way to test, found on that page, was:

const pull = require('pull-stream')
// ...

// stream all messages in all feeds, ordered by publish time
// log each to the console

pull(
  server.createFeedStream(),
  pull.log()
)

pull.drain is really what I want in order to do something functional, though, keeping the stream flowing:

pull(
  server.createFeedStream(),
  pull.drain((msg) => {
    // do whatever I like with the message
  })
)

A pretty big time investment went into this simple goal. But it comes together, and definitely shows promise of its value in other, bigger contexts. I am still interested in learning more about the various source and through functions in pull-streams. Obviously there are some classic functional programming concepts in there, like filter and map.

At this point I’m just running the file with node index.js over and over again for my testing.

Having now successfully written and read messages, it was time to go encrypted.
It was fairly obvious that I’d need the ssb-private module/plugin installed; up to this point everything could work just fine without it. This page was a good guide:
https://www.scuttlebutt.nz/guides/ssb-server/publish-encrypted-messages

Now we have server.private to work with. We can replace both the read and write functions (I only learned about replacing the read afterwards).

For read, replace server.createFeedStream() with server.private.read({}).
I think it errored out without at least an empty options object, which seems like buggy behaviour.

Docs for ssb-private are minimal: https://github.com/ssbc/ssb-private
About read it says:
"read(opts) (sync) Returns a stream of private messages. Takes query options similar to ssb-query."
ssb-query links to: https://github.com/dominictarr/ssb-query
That link is oddly misleading, at least unless you dig a layer deeper.
https://github.com/ssbc/ssb-query#queryread-querylimitreverseoldlive This gives a hint; follow another link down (“see createLogStream”) to get to useful docs:
https://github.com/ssbc/ssb-db#ssbdbcreatelogstreamltltegtgte-timestamp-reverseoldliveraw-boolean-limit-number--pullsource

Most useful to me was the bit about live and old option keys.
gt, gte, lt, lte ranges are supported, via ltgt. If reverse is set to true, results will be from oldest to newest. If limit is provided, the stream will stop after that many items. old and live control whether to include old and live (newly written) messages, as via pull-live.

I didn’t want old, but did want live so I updated server.private.read({}) to server.private.read({ old: false, live: true }).

This left me with two questions:

  1. did server.private.read dump ALL encrypted messages from ANYONE? (still encrypted, of course)
  2. did server.private.read decrypt the messages that were for me?

Through testing, and asking (thanks @cel), I discovered the answers: 1. no, and 2. yes.

It would be helpful to explain this in the docs. I checked the source code at this point and got mystified by a flume query. https://github.com/ssbc/ssb-private/blob/master/index.js#L24-L29

Once again, writing a message was easier: the ssb-private docs + https://www.scuttlebutt.nz/guides/ssb-server/publish-encrypted-messages were easily enough.
From what I know, you can’t publish encrypted messages from the CLI, so it’s confusing for this to be under the ‘Command Line Client’ section in the scuttlebutt.nz guides. Maybe it could be moved out.

To create a function that takes a string/message and a recipient, the ssb code would be:

function send(stringMessage, id) {
  server.private.publish(
    {
      type: 'post',
      text: stringMessage,
      recps: [{ link: id }]
    },
    [id], // those to encrypt for
    (err, msg) => {}
  )
}

This version does not encrypt the message for “yourself” as the bot. To do that, you would just add the id of the server/bot itself to both arrays, in the same format as the recipient. I just didn’t/don’t want to do that.
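To sketch what that variation would look like, here is a hypothetical helper (the name buildPrivatePost is mine, not from any SSB library) that assembles the arguments for server.private.publish, including the bot’s own id so it keeps a readable copy:

```javascript
// Hypothetical helper: build the content object and encryption list for a
// private post that is also readable by the sender (the bot itself).
// selfId would be server.id on a live ssb-server instance.
function buildPrivatePost(stringMessage, recipientId, selfId) {
  return {
    content: {
      type: 'post',
      text: stringMessage,
      recps: [{ link: recipientId }, { link: selfId }]
    },
    encryptFor: [recipientId, selfId] // ids to encrypt the message for
  }
}

// usage sketch (assumes a running server with ssb-private loaded):
// const { content, encryptFor } = buildPrivatePost('hi', recipientId, server.id)
// server.private.publish(content, encryptFor, (err, msg) => {})
```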

There we go, that’s my write-up. This took me longer than I wanted already, so I’m done, goodnight :slight_smile:

So a stripped down version of what I came up with is…

const pull = require('pull-stream')
const Server = require('ssb-server')
const Config = require('ssb-config/inject')
// ssb-test matches the name of ~/.ssb-test folder where the rest of the `config` is
const ssbconfig = Config('ssb-test')

Server
    .use(require('ssb-gossip'))
    .use(require('ssb-replicate'))
    .use(require('ssb-private'))

const server = Server(ssbconfig)

pull(
  server.private.read({ old: false, live: true }), // don't bother with old messages, just stream new ones
  pull.drain((msg) => {
    // already decrypted :)
    // this is the part where I receive incoming messages
    console.log(`receiving a message from ${msg.value.author}`)
    console.log(`message content: ${msg.value.content.text}`)
  })
)

// this is a function where I could send private messages
function send(stringMessage, id) {
  server.private.publish(
    {
      type: 'post',
      text: stringMessage,
      recps: [{ link: id }]
    },
    [id], // those to encrypt for
    (err, msg) => {}
  )
}

package.json dependencies

    "pull-stream": "^3.6.14",
    "ssb-config": "^3.4.2",
    "ssb-gossip": "^1.1.1",
    "ssb-private": "^0.2.3",
    "ssb-replicate": "^1.3.0",
    "ssb-server": "^15.1.2"

Followups:
I think it will still be useful to be able to customize which folder it uses for the config and for file storage, which can be tweaked with ssb-config.

Fuller code, in context:

The original post is here: https://viewer.scuttlebot.io/%EpUhmjQ2Y5CLDRw8kYGnWxc6MHiPTWDgTG17jfchMBk%3D.sha256 which also includes some interesting follow up comments.

Also, this has led to me submitting a pull request to the scuttlebutt.nz documentation.

Contributions to the Open Learning Commons are licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Please honor the spirit of collective open learning by citing the author(s) in the context of a dialogue and/or linking back to the original source.