Logging data while running other code

Hello, I would like to start logging data, run some other Python code, and then stop logging. So I would either want to start logging for, say, 100 seconds and then run my other code, or start logging, run the other code, and then stop logging.

The issue I am having is that when I start logging, the logging code either runs for x seconds before my other code can run, or I have to press Ctrl+C before the other code runs. Would you mind showing me an example of how to get around this problem?

Hi @Xylophone and welcome to the Joulescope forum!

I have a few questions:

  1. Can you clarify what you mean by “logging data”? Joulescopes can provide two types of data:
    a. Streaming sample data, which is normally recorded to JLS files.
    b. Streaming statistics data, which can be recorded to CSV files.
  2. Are you using a Joulescope JS110 or JS220?
  3. Do you care how you access the Joulescope devices through Python? You can use the pyjoulescope package or directly use the newer underlying pyjoulescope_driver.
  1. Right now I am using the downsample_logging.py script, so it would be statistics data. I am mainly measuring time and current down to the nA range.
  2. I am using a JS220.
  3. I do not think I care. Are there any differences? Right now I am using the pyjoulescope package.

How about something like this?

#!/usr/bin/env python3
# -*- coding: utf-8 -*-

# Copyright 2019-2024 Jetperch LLC
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#     http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# See the License for the specific language governing permissions and
# limitations under the License.

"""Display statistics from the first connected Joulescope."""

from joulescope import scan
import time

def statistics_callback(stats):
    """The function called for each statistics.

    :param stats: The statistics data structure.
    t = stats['time']['range']['value'][0]
    i = stats['signals']['current']['µ']
    v = stats['signals']['voltage']['µ']
    p = stats['signals']['power']['µ']
    c = stats['accumulators']['charge']
    e = stats['accumulators']['energy']

    fmts = ['{x:.9f}', '{x:.3f}', '{x:.9f}', '{x:.9f}', '{x:.9f}']
    values = []
    for k, fmt in zip([i, v, p, c, e], fmts):
        value = fmt.format(x=k['value'])
        value = f'{value} {k["units"]}'
        values.append(value)
    print(f"{t:.1f}: " + ', '.join(values))

def run():
    devices = scan(config='off')
    if not len(devices):
        print('No Joulescope device found')
        return 1
    device = devices[0]
    device.statistics_callback_register(statistics_callback)
    device.open()
    try:
        device.parameter_set('i_range', 'auto')
        device.parameter_set('v_range', '15V')
        time.sleep(10)   # replace with your code
    finally:
        device.close()
    return 0

if __name__ == '__main__':
    run()
I will take a look at the body of the statistics_callback function.

I managed to make it work by using multiple processes like so:

import threading
import subprocess

def run_script(script_args):
    subprocess.run(["python"] + script_args.split())

if __name__ == "__main__":
    script1_args = "downsample_logging.py --time 100 --frequency 100"
    script2_args = "current_leakage_test.py"

    script1_thread = threading.Thread(target=run_script, args=(script1_args,))
    script2_thread = threading.Thread(target=run_script, args=(script2_args,))

    script1_thread.start()
    script2_thread.start()
    script1_thread.join()
    script2_thread.join()

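A variation on the same idea, if you want explicit start/stop control rather than a fixed --time: launch the logger as a subprocess, run your code, then stop the logger. This is only a sketch; it assumes the logger script exits cleanly on Ctrl+C (downsample_logging.py does), and the function name run_with_logging is my own.

```python
import signal
import subprocess


def run_with_logging(logger_cmd, work_fn, stop_timeout=10.0):
    """Start a logger subprocess, run work_fn(), then stop the logger.

    :param logger_cmd: The logger command as an argument list.
    :param work_fn: The callable to run while logging is active.
    :param stop_timeout: Seconds to wait for a clean logger exit.
    :return: The value returned by work_fn().
    """
    logger = subprocess.Popen(logger_cmd)
    try:
        return work_fn()
    finally:
        # Ask the logger to stop. SIGINT is the programmatic equivalent
        # of Ctrl+C on POSIX; on Windows you may need to start the child
        # with CREATE_NEW_PROCESS_GROUP and send CTRL_BREAK_EVENT instead.
        logger.send_signal(signal.SIGINT)
        try:
            logger.wait(timeout=stop_timeout)
        except subprocess.TimeoutExpired:
            logger.terminate()
            logger.wait()


# Hypothetical usage (arguments match the post above):
# run_with_logging(
#     ['python', 'downsample_logging.py', '--frequency', '100'],
#     my_test_function)
```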
Hi @Xylophone - Yes, using Python to run two processes can definitely work.

If you want a single, integrated python script, the example I provided is a good start. You can modify the statistics_callback function to do whatever you want with the statistics. If you also want to maintain state, like an open file handle, you could convert this to a class with a __call__ method.
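As a minimal sketch of that class-based approach: the StatisticsWriter name and CSV columns below are my own choices, and the field lookups assume the statistics structure from the earlier example.

```python
import csv


class StatisticsWriter:
    """Record Joulescope statistics to a CSV file.

    Keeps the file handle open across callbacks. Register an instance
    in place of the plain function, e.g.:
        device.statistics_callback_register(StatisticsWriter('out.csv'))
    """

    def __init__(self, filename):
        self._file = open(filename, 'w', newline='')
        self._writer = csv.writer(self._file)
        self._writer.writerow(['time', 'current', 'voltage', 'power'])

    def __call__(self, stats):
        # Invoked for each statistics update, like the function version.
        t = stats['time']['range']['value'][0]
        i = stats['signals']['current']['µ']['value']
        v = stats['signals']['voltage']['µ']['value']
        p = stats['signals']['power']['µ']['value']
        self._writer.writerow([t, i, v, p])

    def close(self):
        self._file.close()
```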

Does this give you what you want?
