Run multiple Joulescopes smoothly with live feedback

I am using Ubuntu on a Raspberry Pi (with Python 3) and the goal is to run 6 Joulescopes. I realistically only need data for about 20 seconds, and I would prefer to see a live feed of the data as well as have it saved for backup.

My issue is that I have downsample_logging.py and have adjusted it so crashes are rare, but sometimes it still crashes, OR it only finds 4 or 5 of the 6 scopes, OR it won't get the correct data. (I am also using USB 3.0, so that is not one of the culprits.) I also think part of the issue is that I do not need the file to be CSV; I would prefer .txt or SQLite (which I haven't figured out how to connect to quite yet).

I have seen this executed from the terminal, where the person specifies the recording time and a file name, but I have not seen it done with multiple scopes. I assume I HAVE to write my own program and can't just run what I downloaded from the Joulescope site.

PS - I cannot get the Joulescope UI to work because PySide2 is incompatible with the updated Python.

Hi @confused and welcome to the Joulescope forum!

The Raspberry Pi is quite special since it contains a unique graphics engine. The Raspberry Pi 3 and older are all 32-bit, while the 4 can be either 32-bit or 64-bit, which makes things even more interesting. I tried unsuccessfully to get the Joulescope UI running on a Raspberry Pi 4 a year ago. Here’s the GitHub issue. Perhaps the Joulescope UI will work now, but it’s not something we support. The less stressful approach is to use an Intel-based platform with Linux (Ubuntu), Windows, or macOS.

Now, the Raspberry Pi 4 is perfectly capable of running downsample_logging.py, but I am not surprised that you are running into issues with more than 3 Joulescopes.

Unfortunately, USB 3 still limits you to 2 or 3 Joulescopes, depending upon the host system. USB 3 is actually two entirely separate things in one: a USB 2 “differential” pair (low-speed, full-speed, high-speed), and one or more new super-speed differential pairs. Joulescopes work on the USB 2 “differential” pair. Using a USB 3 port does nothing to change this, and the system is still limited to USB high-speed operation. USB 3 is a confusing mess, and the USB-IF is not helping.

Joulescopes always transmit full-rate data unless you use the on-instrument statistics computation. With downsample_logging, you can pass the --source sensor option, and it will work with all 6 attached Joulescopes! The limitation is that the on-instrument statistics always run at 2 Hz. Note that downsample_logging captures data from multiple connected Joulescopes.
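
Assuming you are running the stock downsample_logging.py script from the examples, the invocation would look something like this (the --source sensor flag is the important part):

```
python3 downsample_logging.py --source sensor
```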


Does this help? If you could provide more detail on exactly what you want to collect, I am happy to help figure out a way to accomplish it. Here are some questions:

  1. Is 2 Hz statistics sufficient?
  2. If not, what sampling rate do you want? Note that CSV, TXT, and SQLite are all impractical for full-rate 2 Msps data. For example, Excel can only open 1,048,576 rows, which is about 1/2 second of data.
  3. Is a “live feed” of text statistics printed to the console sufficient? Or do you really want a UI?
  4. What do you intend to do with this data? Are you analyzing it in real time? Viewing it visually in real time? Analyzing it offline with Python? Something else?

I am running the Raspberry Pi 4 with Ubuntu. Still no way of getting the UI to work because PySide2 is incompatible with Python 3.9.

  1. 2 Hz stats should be okay
  2. The data is already logging to CSV; I don’t know whether a .txt file would still be impractical at the low rate.
  3. A “live feed” on idle/terminal would be just fine, no need for a UI
  4. I need to grab one data point from about 3-5 seconds of data in real time. A backup of the data would be nice just in case though.

In that case, check out the statistics example. It’s much simpler than downsample_logging but still captures and displays the sensor (on-instrument) statistics from all connected Joulescopes. It should easily work with your 6 Joulescopes. You can modify handle_queue and/or statistics_callback to print and save the statistics however you would like.
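
As a rough sketch (not the shipped example code), a handler like the following could be called from handle_queue to print a live line to the terminal and append it to a plain .txt file. The function name log_statistics is my own, and the field layout assumes the joulescope statistics dict format (stats['signals']['current']['µ']['value'], etc.); double-check it against what your callback actually receives:

```python
def log_statistics(device_str, stats, path='joulescope_stats.txt'):
    """Print one statistics update and append it to a plain-text file.

    Sketch only: field names assume the joulescope statistics dict layout.
    """
    current = stats['signals']['current']['µ']['value']  # mean current [A]
    voltage = stats['signals']['voltage']['µ']['value']  # mean voltage [V]
    power = stats['signals']['power']['µ']['value']      # mean power [W]
    line = f'{device_str},{current:.9g},{voltage:.6g},{power:.9g}'
    print(line)                 # live feed on the terminal
    with open(path, 'a') as f:  # append for the backup copy
        f.write(line + '\n')
```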

For 2 Hz data, CSV, TXT, and SQLite would all work. If you want to use SQLite, see the sqlite3 Python package.
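
A minimal SQLite sketch with the standard-library sqlite3 module could look like this; the table layout and the save_stat helper are my own assumptions, and you would call save_stat from wherever you handle each statistics update:

```python
import sqlite3
import time

con = sqlite3.connect('joulescope_stats.db')
con.execute('CREATE TABLE IF NOT EXISTS stats ('
            'host_time REAL, device TEXT, '
            'current REAL, voltage REAL, power REAL)')

def save_stat(device_str, current, voltage, power):
    """Insert one 2 Hz statistics row; commit so data survives a crash."""
    con.execute('INSERT INTO stats VALUES (?, ?, ?, ?, ?)',
                (time.time(), device_str, current, voltage, power))
    con.commit()
```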

Note that the statistics from each instrument are free-running: there is no guarantee on offset, and the frequency of each is accurate to ±20 ppm. However, the statistics example already handles resynchronizing the statistics messages onto the main queue. You could also modify cbk inside statistics_callback_factory to add a host-side timestamp.
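
For example (a sketch only; the exact factory and callback signatures in the shipped example may differ), the per-device callback could stamp each message before queueing it:

```python
import time

def statistics_callback_factory(device, data_queue):
    """Build a per-device callback that adds a host-side timestamp.

    Sketch based on the statistics example; adjust to match the actual
    callback signature used by your joulescope package version.
    """
    def cbk(stats):
        stats['host_time'] = time.time()      # host wall-clock timestamp
        data_queue.put((str(device), stats))  # tag which Joulescope it came from
    return cbk
```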

Does this make sense?

Yes, that script works a lot better, thank you!!

I am not too sure how to modify these to get a .txt file.

This is a good tip; I will try this as well, thank you.

I created a new statistics_logger.py example. This expands on the statistics.py example with configurable console output and output file logging. It should be even closer to what you want.

For help with the available command-line options, run python statistics_logger.py --help.

What do you think? If you want anything not already in this script, let me know!
