Recording Experiment Workflow Results
While running an experiment workflow, you will usually want to keep a record of what took place -- a kind of digital lab book. The LabOne Q Applications Library provides logbooks for just this task.
Each workflow run creates its own logbook. The logbook records the tasks being run and may also be used to store additional data such as device settings, LabOne Q experiments, qubits, and the results of experiments and analyses.
Logbooks need to be stored somewhere, and within the Applications Library, this place is called a logbook store.
Currently the Applications Library supports two kinds of stores:

- FolderStore
- LoggingStore

The FolderStore writes logbooks to a folder on disk. It is used to keep a permanent record of the experiment workflow.

The LoggingStore logs what is happening using Python's logging. It provides a quick overview of the steps performed by a workflow.
We'll look at each of these in more detail shortly, but first let us set up a quantum platform to run some experiments on so we have something to record.
Setting up a quantum platform
Build your LabOne Q DeviceSetup, qubits and Session as normal. Here we import a demonstration tunable-transmon quantum platform from the library and the amplitude Rabi experiment:
import numpy as np
from laboneq.simple import *
from laboneq_applications.experiments import amplitude_rabi
from laboneq_applications.qpu_types.tunable_transmon import demo_platform
# Create a demonstration QuantumPlatform for a tunable-transmon QPU:
qt_platform = demo_platform(n_qubits=6)
# The platform contains a setup, which is an ordinary LabOne Q DeviceSetup:
setup = qt_platform.setup
# And a tunable-transmon QPU:
qpu = qt_platform.qpu
# Inside the QPU, we have qubits, which is a list of six LabOne Q Application
# Library TunableTransmonQubit qubits:
qubits = qpu.qubits
session = Session(setup)
session.connect(do_emulation=True)
[2024.12.19 17:11:41.221] INFO Logging initialized from [Default inline config in laboneq.laboneq_logging] logdir is /builds/qccs/laboneq-applications/docs/sources/tutorials/sources/laboneq_output/log
[2024.12.19 17:11:41.224] INFO VERSION: laboneq 2.43.0
[2024.12.19 17:11:41.225] INFO Connecting to data server at localhost:8004
[2024.12.19 17:11:41.227] INFO Connected to Zurich Instruments LabOne Data Server version 24.10 at localhost:8004
[2024.12.19 17:11:41.229] INFO Configuring the device setup
[2024.12.19 17:11:41.265] INFO The device setup is configured
<laboneq.dsl.session.ConnectionState at 0x743ea4818410>
The LoggingStore
When you import the laboneq_applications library it automatically creates a default LoggingStore for you. This logging store is used whenever a workflow is executed and logs information about:
- the start and end of workflows
- the start and end of tasks
- any errors that occur
- comments (adhoc messages from tasks, more on these later)
- any data files that would be saved if a folder store was in use (more on these later too)
These logs don't save anything to disk, but they allow us to see what events are recorded and what would be saved if we did have a folder store active.
An example of logging
Let's run the amplitude Rabi experiment and take a look:
amplitudes = np.linspace(0.0, 0.9, 10)
options = amplitude_rabi.experiment_workflow.options()
options.count(10)
options.averaging_mode(AveragingMode.CYCLIC)
rabi_tb = amplitude_rabi.experiment_workflow(
    session,
    qpu,
    qubits[0],
    amplitudes,
    options=options,
)
The workflow has not yet been executed, but when you run the next cell, you should see messages like:
──────────────────────────────────────────────────────────────────────────────
Workflow 'amplitude_rabi': execution started
──────────────────────────────────────────────────────────────────────────────
appear in the logs beneath the cell.
result = rabi_tb.run()
[2024.12.19 17:11:41.282] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.12.19 17:11:41.282] INFO Workflow 'amplitude_rabi': execution started at 2024-12-19 17:11:41.281679Z
[2024.12.19 17:11:41.283] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.12.19 17:11:41.283] INFO Task 'temporary_modify': started at 2024-12-19 17:11:41.283600Z
[2024.12.19 17:11:41.284] INFO Task 'temporary_modify': ended at 2024-12-19 17:11:41.284653Z
[2024.12.19 17:11:41.285] INFO Task 'create_experiment': started at 2024-12-19 17:11:41.285227Z
[2024.12.19 17:11:41.287] INFO Task 'create_experiment': ended at 2024-12-19 17:11:41.287687Z
[2024.12.19 17:11:41.288] INFO Task 'compile_experiment': started at 2024-12-19 17:11:41.288291Z
[2024.12.19 17:11:41.295] INFO Resolved modulation type of oscillator 'q0_readout_acquire_osc' on signal '/logical_signal_groups/q0/acquire' to SOFTWARE
[2024.12.19 17:11:41.295] INFO Resolved modulation type of oscillator 'q0_drive_ge_osc' on signal '/logical_signal_groups/q0/drive' to HARDWARE
[2024.12.19 17:11:41.295] INFO Resolved modulation type of oscillator 'q0_drive_ef_osc' on signal '/logical_signal_groups/q0/drive_ef' to HARDWARE
[2024.12.19 17:11:41.296] INFO Starting LabOne Q Compiler run...
[2024.12.19 17:11:41.304] INFO Schedule completed. [0.006 s]
[2024.12.19 17:11:41.329] INFO Code generation completed for all AWGs. [0.025 s]
[2024.12.19 17:11:41.330] INFO Completed compilation step 1 of 1. [0.032 s]
[2024.12.19 17:11:41.334] INFO ─────────────────────────────────────────────────────────────────────
[2024.12.19 17:11:41.334] INFO   Device           AWG   SeqC LOC   CT entries   Waveforms   Samples
[2024.12.19 17:11:41.334] INFO ─────────────────────────────────────────────────────────────────────
[2024.12.19 17:11:41.335] INFO   device_hdawg     0            4            1           0         0
[2024.12.19 17:11:41.335] INFO   device_shfqc     0           17            0           1      8000
[2024.12.19 17:11:41.335] INFO   device_shfqc_sg  0           35           11           2       448
[2024.12.19 17:11:41.336] INFO ─────────────────────────────────────────────────────────────────────
[2024.12.19 17:11:41.336] INFO   TOTAL                        56           12                  8448
[2024.12.19 17:11:41.336] INFO ─────────────────────────────────────────────────────────────────────
[2024.12.19 17:11:41.341] INFO Finished LabOne Q Compiler run.
[2024.12.19 17:11:41.343] INFO Task 'compile_experiment': ended at 2024-12-19 17:11:41.343458Z
[2024.12.19 17:11:41.344] INFO Task 'run_experiment': started at 2024-12-19 17:11:41.344051Z
[2024.12.19 17:11:41.352] INFO Starting near-time execution...
[2024.12.19 17:11:41.363] INFO Finished near-time execution.
[2024.12.19 17:11:41.364] INFO Task 'run_experiment': ended at 2024-12-19 17:11:41.364696Z
[2024.12.19 17:11:41.365] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.12.19 17:11:41.366] INFO Workflow 'analysis_workflow': execution started at 2024-12-19 17:11:41.365500Z
[2024.12.19 17:11:41.367] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.12.19 17:11:41.367] INFO Task 'calculate_qubit_population': started at 2024-12-19 17:11:41.367419Z
[2024.12.19 17:11:41.368] INFO Task 'calculate_qubit_population': ended at 2024-12-19 17:11:41.368263Z
[2024.12.19 17:11:41.369] INFO Task 'fit_data': started at 2024-12-19 17:11:41.368869Z
[2024.12.19 17:11:41.750] INFO Task 'fit_data': ended at 2024-12-19 17:11:41.749719Z
[2024.12.19 17:11:41.750] INFO Task 'extract_qubit_parameters': started at 2024-12-19 17:11:41.750552Z
[2024.12.19 17:11:41.752] ERROR Could not extract pi- and pi/2-pulse amplitudes for q0.
[2024.12.19 17:11:41.752] ERROR Could not extract pi- and pi/2-pulse amplitudes for q0.
[2024.12.19 17:11:41.753] INFO Task 'extract_qubit_parameters': ended at 2024-12-19 17:11:41.753444Z
[2024.12.19 17:11:41.754] INFO Task 'plot_raw_complex_data_1d': started at 2024-12-19 17:11:41.754151Z
[2024.12.19 17:11:41.781] INFO Artifact: 'Raw_data_q0' of type 'Figure' logged at 2024-12-19 17:11:41.780909Z
[2024.12.19 17:11:41.781] INFO Task 'plot_raw_complex_data_1d': ended at 2024-12-19 17:11:41.781626Z
[2024.12.19 17:11:41.782] INFO Task 'plot_population': started at 2024-12-19 17:11:41.782292Z
[2024.12.19 17:11:41.794] INFO Artifact: 'Rabi_q0' of type 'Figure' logged at 2024-12-19 17:11:41.794236Z
[2024.12.19 17:11:41.795] INFO Task 'plot_population': ended at 2024-12-19 17:11:41.794969Z
[2024.12.19 17:11:41.795] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.12.19 17:11:41.796] INFO Workflow 'analysis_workflow': execution ended at 2024-12-19 17:11:41.795607Z
[2024.12.19 17:11:41.796] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.12.19 17:11:41.797] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.12.19 17:11:41.797] INFO Workflow 'amplitude_rabi': execution ended at 2024-12-19 17:11:41.797224Z
[2024.12.19 17:11:41.798] INFO ──────────────────────────────────────────────────────────────────────────────
And that's all there is to the basic logging functionality.
Advanced logging uses
If you need to create a logging store of your own you can do so as follows:
from laboneq.workflow.logbook import LoggingStore
logging_store = LoggingStore()
The logging store created above won't be active unless you run:
logging_store.activate()
And you deactivate it with:
logging_store.deactivate()
You can access the default logging store by importing it from laboneq.workflow.logbook:
from laboneq.workflow.logbook import DEFAULT_LOGGING_STORE
DEFAULT_LOGGING_STORE
<laboneq.workflow.logbook.logging_store.LoggingStore at 0x743dfb164ef0>
You can also inspect all the active logbook stores:
from laboneq.workflow.logbook import active_logbook_stores
active_logbook_stores()
[]
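As a quick, illustrative check (using only the store created above), activating a store makes it appear in this list, and deactivating it removes it again so the rest of the tutorial is unaffected:
logging_store.activate()
print(active_logbook_stores())  # now includes the LoggingStore created above
logging_store.deactivate()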
The FolderStore
Using the folder store
The FolderStore saves workflow results on disk and is likely the most important logbook store you'll use.
You can import it as follows:
from laboneq.workflow.logbook import FolderStore
To create a folder store you'll need to pick a folder on disk to store logbooks in. Here we select ./experiment_store as the folder name, but you should pick your own.
Each logbook created by a workflow will have its own sub-folder. The sub-folder name will start with a timestamp, followed by the name of the workflow, for example 20240728T175500-amplitude-rabi/. If necessary, a unique count will be added at the end to make the sub-folder name unique.
The timestamps are in UTC, so they might be offset from your local time, but will be meaningful to users in other timezones and will remain correctly ordered when changing to or from daylight savings.
The folder store will need to be activated before workflows will use it automatically.
folder_store = FolderStore("./experiment_store")
folder_store.activate()
Now let's run the amplitude Rabi workflow. As before we'll see the task events being logged. Afterwards we'll explore the folder to see what has been written to disk.
result = rabi_tb.run()
[2024.12.19 17:11:41.900] INFO Resolved modulation type of oscillator 'q0_readout_acquire_osc' on signal '/logical_signal_groups/q0/acquire' to SOFTWARE
[2024.12.19 17:11:41.900] INFO Resolved modulation type of oscillator 'q0_drive_ge_osc' on signal '/logical_signal_groups/q0/drive' to HARDWARE
[2024.12.19 17:11:41.900] INFO Resolved modulation type of oscillator 'q0_drive_ef_osc' on signal '/logical_signal_groups/q0/drive_ef' to HARDWARE
[2024.12.19 17:11:41.901] INFO Starting LabOne Q Compiler run...
[2024.12.19 17:11:41.909] INFO Schedule completed. [0.006 s]
[2024.12.19 17:11:41.932] INFO Code generation completed for all AWGs. [0.023 s]
[2024.12.19 17:11:41.932] INFO Completed compilation step 1 of 1. [0.029 s]
[2024.12.19 17:11:41.936] INFO ─────────────────────────────────────────────────────────────────────
[2024.12.19 17:11:41.937] INFO   Device           AWG   SeqC LOC   CT entries   Waveforms   Samples
[2024.12.19 17:11:41.937] INFO ─────────────────────────────────────────────────────────────────────
[2024.12.19 17:11:41.937] INFO   device_hdawg     0            4            1           0         0
[2024.12.19 17:11:41.938] INFO   device_shfqc     0           17            0           1      8000
[2024.12.19 17:11:41.938] INFO   device_shfqc_sg  0           35           11           2       448
[2024.12.19 17:11:41.938] INFO ─────────────────────────────────────────────────────────────────────
[2024.12.19 17:11:41.939] INFO   TOTAL                        56           12                  8448
[2024.12.19 17:11:41.939] INFO ─────────────────────────────────────────────────────────────────────
[2024.12.19 17:11:41.943] INFO Finished LabOne Q Compiler run.
[2024.12.19 17:11:41.965] INFO Starting near-time execution...
[2024.12.19 17:11:41.974] INFO Finished near-time execution.
[2024.12.19 17:11:42.018] WARNING Type <class 'dict'> not supported by the serializer [name: calculate_qubit_population.output.q0].
[2024.12.19 17:11:42.019] WARNING Type <class 'dict'> not supported by the serializer [name: fit_data.input.processed_data_dict].
[2024.12.19 17:11:42.410] WARNING Type <class 'lmfit.model.ModelResult'> not supported by the serializer [name: fit_data.output.q0].
[2024.12.19 17:11:42.411] WARNING Type <class 'dict'> not supported by the serializer [name: extract_qubit_parameters.input.processed_data_dict].
[2024.12.19 17:11:42.411] WARNING Type <class 'dict'> not supported by the serializer [name: extract_qubit_parameters.input.fit_results].
[2024.12.19 17:11:42.651] WARNING Type <class 'dict'> not supported by the serializer [name: plot_population.input.processed_data_dict].
[2024.12.19 17:11:42.651] WARNING Type <class 'dict'> not supported by the serializer [name: plot_population.input.fit_results].
If you no longer wish to automatically store workflow results in the folder store, you can deactivate it with:
folder_store.deactivate()
Exploring what was written to disk
Here we will use Python's pathlib functionality to explore what has been written to disk, but you can also use whatever ordinary tools you prefer (terminal, file navigator).
import json
from pathlib import Path
Remember that above we requested that the folder store use a folder named experiment_store. Let's list the logbooks that were created in that folder:
store_folder = Path("experiment_store")
amplitude_rabi_folders = sorted(store_folder.glob("*/*-amplitude-rabi"))
Our amplitude Rabi experiment is the most recent one run, so let's look at the files within the most recent folder. Note that the logbook folder names start with a timestamp followed by the name of the workflow, which allows us to easily order them by time and to find the workflow we're looking for:
amplitude_rabi_folder = amplitude_rabi_folders[-1]
amplitude_rabi_files = sorted(amplitude_rabi_folder.iterdir())
amplitude_rabi_files
[PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/Rabi-q0.png'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/Raw-data-q0.png'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/amplitude-rabi.input.amplitudes.npy'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/amplitude-rabi.input.options.json'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/amplitude-rabi.input.qpu.json'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/amplitude-rabi.input.qubits.json'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/analysis-workflow.input.options.json'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/calculate-qubit-population.input.options.json'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/compile-experiment.output.json'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/create-experiment.input.options.json'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/create-experiment.output.json'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/extract-qubit-parameters.input.options.json'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/fit-data.input.options.json'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/log.jsonl'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/plot-population.input.options.json'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/plot-raw-complex-data-1d.input.options.json'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/run-experiment.input.options.json'), PosixPath('experiment_store/20241219/20241219T171141-amplitude-rabi/run-experiment.output.json')]
Let us look at the file log.jsonl. This is the log of what took place. The log is stored in a format called "JSONL", which means each line of the log is a simple Python dictionary stored as JSON. Larger objects and certain types of data are stored as separate files in a subfolder called obj or, for some important data, in the same folder.
Let's open the file and list the logs:
experiment_log = amplitude_rabi_folder / "log.jsonl"
logs = [json.loads(line) for line in experiment_log.read_text().splitlines()]
logs
[{'event': 'start', 'workflow': 'amplitude_rabi', 'index': [], 'time': '2024-12-19 17:11:41.831733+00:00', 'input': {'session': '...', 'qpu': [{'filename': 'amplitude-rabi.input.qpu.json'}], 'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'amplitudes': [{'filename': 'amplitude-rabi.input.amplitudes.npy'}], 'temporary_parameters': None, 'options': [{'filename': 'amplitude-rabi.input.options.json'}]}}, {'event': 'task_start', 'task': 'temporary_modify', 'index': [], 'time': '2024-12-19 17:11:41.879251+00:00', 'input': {'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'temporary_parameters': None}}, {'event': 'task_end', 'task': 'temporary_modify', 'index': [], 'time': '2024-12-19 17:11:41.879337+00:00', 'output': [{'filename': 'amplitude-rabi.input.qubits.json'}]}, {'event': 'task_start', 'task': 'create_experiment', 'index': [], 'time': '2024-12-19 17:11:41.879404+00:00', 'input': {'qpu': [{'filename': 'amplitude-rabi.input.qpu.json'}], 'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'amplitudes': [{'filename': 'amplitude-rabi.input.amplitudes.npy'}], 'options': [{'filename': 'create-experiment.input.options.json'}]}}, {'event': 'task_end', 'task': 'create_experiment', 'index': [], 'time': '2024-12-19 17:11:41.888149+00:00', 'output': [{'filename': 'create-experiment.output.json'}]}, {'event': 'task_start', 'task': 'compile_experiment', 'index': [], 'time': '2024-12-19 17:11:41.894435+00:00', 'input': {'session': '...', 'experiment': [{'filename': 'create-experiment.output.json'}], 'compiler_settings': None}}, {'event': 'task_end', 'task': 'compile_experiment', 'index': [], 'time': '2024-12-19 17:11:41.945701+00:00', 'output': [{'filename': 'compile-experiment.output.json'}]}, {'event': 'task_start', 'task': 'run_experiment', 'index': [], 'time': '2024-12-19 17:11:41.959506+00:00', 'input': {'session': '...', 'compiled_experiment': [{'filename': 'compile-experiment.output.json'}], 'options': [{'filename': 'run-experiment.input.options.json'}]}}, {'event': 'task_end', 'task': 'run_experiment', 'index': [], 'time': '2024-12-19 17:11:41.975339+00:00', 'output': [{'filename': 'run-experiment.output.json'}]}, {'event': 'start', 'workflow': 'analysis_workflow', 'index': [], 'time': '2024-12-19 17:11:41.976329+00:00', 'input': {'result': [{'filename': 'run-experiment.output.json'}], 'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'amplitudes': [{'filename': 'amplitude-rabi.input.amplitudes.npy'}], 'options': [{'filename': 'analysis-workflow.input.options.json'}]}}, {'event': 'task_start', 'task': 'calculate_qubit_population', 'index': [], 'time': '2024-12-19 17:11:42.009972+00:00', 'input': {'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'result': [{'filename': 'run-experiment.output.json'}], 'sweep_points': [{'filename': 'amplitude-rabi.input.amplitudes.npy'}], 'options': [{'filename': 'calculate-qubit-population.input.options.json'}]}}, {'event': 'task_end', 'task': 'calculate_qubit_population', 'index': [], 'time': '2024-12-19 17:11:42.018347+00:00', 'output': {'q0': [{'error': "Type <class 'dict'> not supported by the serializer [name: calculate_qubit_population.output.q0]."}]}}, {'event': 'task_start', 'task': 'fit_data', 'index': [], 'time': '2024-12-19 17:11:42.019073+00:00', 'input': {'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'processed_data_dict': [{'error': "Type <class 'dict'> not supported by the serializer [name: fit_data.input.processed_data_dict]."}], 'options': [{'filename': 
'fit-data.input.options.json'}]}}, {'event': 'task_end', 'task': 'fit_data', 'index': [], 'time': '2024-12-19 17:11:42.410371+00:00', 'output': {'q0': [{'error': "Type <class 'lmfit.model.ModelResult'> not supported by the serializer [name: fit_data.output.q0]."}]}}, {'event': 'task_start', 'task': 'extract_qubit_parameters', 'index': [], 'time': '2024-12-19 17:11:42.411126+00:00', 'input': {'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'processed_data_dict': [{'error': "Type <class 'dict'> not supported by the serializer [name: extract_qubit_parameters.input.processed_data_dict]."}], 'fit_results': [{'error': "Type <class 'dict'> not supported by the serializer [name: extract_qubit_parameters.input.fit_results]."}], 'options': [{'filename': 'extract-qubit-parameters.input.options.json'}]}}, {'event': 'log', 'message': 'Could not extract pi- and pi/2-pulse amplitudes for q0.', 'time': '2024-12-19 17:11:42.421321+00:00', 'level': 40}, {'event': 'task_end', 'task': 'extract_qubit_parameters', 'index': [], 'time': '2024-12-19 17:11:42.421380+00:00', 'output': {'old_parameter_values': {'q0': {'ge_drive_amplitude_pi': 0.8, 'ge_drive_amplitude_pi2': 0.4}}, 'new_parameter_values': {'q0': {}}}}, {'event': 'task_start', 'task': 'plot_raw_complex_data_1d', 'index': [], 'time': '2024-12-19 17:11:42.421495+00:00', 'input': {'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'result': [{'filename': 'run-experiment.output.json'}], 'sweep_points': [{'filename': 'amplitude-rabi.input.amplitudes.npy'}], 'xlabel': 'Amplitude Scaling', 'xscaling': 1.0, 'options': [{'filename': 'plot-raw-complex-data-1d.input.options.json'}]}}, {'event': 'artifact', 'time': '2024-12-19 17:11:42.441662+00:00', 'artifact_name': 'Raw_data_q0', 'artifact_type': 'Figure', 'artifact_metadata': {}, 'artifact_options': {}, 'artifact_files': [{'filename': 'Raw-data-q0.png'}]}, {'event': 'task_end', 'task': 'plot_raw_complex_data_1d', 'index': [], 'time': '2024-12-19 17:11:42.651037+00:00', 'output': {'q0': [{'filename': 'Raw-data-q0.png'}]}}, {'event': 'task_start', 'task': 'plot_population', 'index': [], 'time': '2024-12-19 17:11:42.651208+00:00', 'input': {'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'processed_data_dict': [{'error': "Type <class 'dict'> not supported by the serializer [name: plot_population.input.processed_data_dict]."}], 'fit_results': [{'error': "Type <class 'dict'> not supported by the serializer [name: plot_population.input.fit_results]."}], 'qubit_parameters': {'old_parameter_values': {'q0': {'ge_drive_amplitude_pi': 0.8, 'ge_drive_amplitude_pi2': 0.4}}, 'new_parameter_values': {'q0': {}}}, 'options': [{'filename': 'plot-population.input.options.json'}]}}, {'event': 'artifact', 'time': '2024-12-19 17:11:42.671185+00:00', 'artifact_name': 'Rabi_q0', 'artifact_type': 'Figure', 'artifact_metadata': {}, 'artifact_options': {}, 'artifact_files': [{'filename': 'Rabi-q0.png'}]}, {'event': 'task_end', 'task': 'plot_population', 'index': [], 'time': '2024-12-19 17:11:42.766617+00:00', 'output': {'q0': [{'filename': 'Rabi-q0.png'}]}}, {'event': 'end', 'workflow': 'analysis_workflow', 'index': [], 'time': '2024-12-19 17:11:42.766722+00:00', 'output': {'old_parameter_values': {'q0': {'ge_drive_amplitude_pi': 0.8, 'ge_drive_amplitude_pi2': 0.4}}, 'new_parameter_values': {'q0': {}}}}, {'event': 'end', 'workflow': 'amplitude_rabi', 'index': [], 'time': '2024-12-19 17:11:42.766837+00:00', 'output': [{'filename': 'run-experiment.output.json'}]}]
In the remaining sections we'll look at how to write adhoc comments into the logs and how to save data files to disk.
The timestamp of the start of the workflow execution and the name(s) of the currently executed workflow(s) can be obtained from within a task, provided the task is executed from a workflow. If the task was not called from within a workflow execution context, the timestamp will be None and the list of workflow names will be empty. The timestamp and the first of the workflow names are also part of the folder path when a folder store is used. Here is an example of a task that reads the outermost workflow's name and the timestamp:
from laboneq.workflow import (
    execution_info,
    task,
    workflow,
)

@task
def folder_logger_timestamp_and_workflow_name():
    info = execution_info()  # Returns a WorkflowExecutionInfoView object
    return (info.workflows[0], info.start_time)

@workflow
def timestamp_and_name_workflow():
    folder_logger_timestamp_and_workflow_name()
wf = timestamp_and_name_workflow()
result = wf.run()
print(result.tasks["folder_logger_timestamp_and_workflow_name"].output)
('timestamp_and_name_workflow', datetime.datetime(2024, 12, 19, 17, 11, 42, 798507, tzinfo=datetime.timezone.utc))
The output of WorkflowExecutionInfoView.workflows is a list, where the outermost workflow is the first element and the innermost workflow is the last element. The output of WorkflowExecutionInfoView.start_time is a datetime.datetime object, which is used for creating the folder logger's data folder in the format YYYYMMDDTHHMMSS (using strftime("%Y%m%dT%H%M%S")) after conversion from UTC to local time.
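As an illustration, the start time returned above can be formatted the same way to reproduce the timestamp part of the folder name. This is only a sketch and reuses the result of the previous cell:
# Format the workflow start time the way the folder store does:
# convert from UTC to local time, then apply "%Y%m%dT%H%M%S".
name, start_time = result.tasks["folder_logger_timestamp_and_workflow_name"].output
print(start_time.astimezone().strftime("%Y%m%dT%H%M%S"))  # value depends on your local timezone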
Loading back data from a file
Currently, the FolderStore cannot be used to load back data from a saved file. This functionality will be added soon.

To load back an object saved by a Workflow, use:
from laboneq import serializers
my_object = serializers.load(path_to_file)
Here, path_to_file is the full path to the data file.
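For example, a sketch of loading back the QPU that was saved as an input of the amplitude Rabi workflow above (the file name is taken from the folder listing shown earlier):
from laboneq import serializers

loaded_qpu = serializers.load(amplitude_rabi_folder / "amplitude-rabi.input.qpu.json")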
Logging comments from within tasks
Logbooks allow tasks to add their own messages to the logbook as comments.
This is done by calling the comment(...) function within a task.
We'll work through an example below:
from laboneq.workflow import comment, task, workflow
Let's write a small workflow and a tiny task that just writes a comment to the logbook:
@task
def log_a_comment(msg):
    comment(msg)

@workflow
def demo_comments():
    log_a_comment("Activating multi-state discrimination! <sirens blare>")
    log_a_comment("Analysis successful! <cheers>")
Now when we run the workflow we'll see the comments appear in the logs:
wf = demo_comments()
result = wf.run()
Above you should see the two comments. They look like this:
Comment: Activating multi-state discrimination! <sirens blare>
...
Comment: Analysis successful! <cheers>
In addition to comment(...), the logbook supports a function log(level: int, message: str, *args: object) which logs a message at the specified logging level, similar to Python's logging module. This is useful for messages that are not regular user comments but that let tasks record issues which are still important to keep on record.
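As a minimal sketch of how such a task might look (assuming log can be imported from laboneq.workflow alongside comment; the task name and threshold are made up for illustration):
import logging

from laboneq.workflow import log, task

@task
def check_calibration_age(days_old):
    if days_old > 7:
        # Recorded in the logbook like a comment, but at WARNING severity.
        log(logging.WARNING, "Calibration data is %s days old.", days_old)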
Storing data from within tasks
Logbooks also allow files to be saved to disk using the function save_artifact.
Here we will create a figure with matplotlib and save it to disk. The folder store will automatically save it as a PNG.
The kinds of objects the folder store can currently save are:
- Python strings (saved as a text file)
- Python bytes (saved as raw data)
- Pydantic models (saved as JSON)
- PIL images (saved as PNGs by default)
- Matplotlib figures (saved as PNGs by default)
- Numpy arrays (saved as Numpy data files)
Support for more kinds of objects is coming soon (e.g. DeviceSetup, Experiment).
import PIL
from laboneq.workflow import save_artifact
from matplotlib import pyplot as plt
Let's write a small workflow that plots the sine function and saves the plot using save_artifact:
@task
def sine_plot():
    fig = plt.figure()
    plt.title("A sine wave")
    x = np.linspace(0, 2 * np.pi, 100)
    y = np.sin(x)
    plt.plot(x, y)
    save_artifact("Sine Plot", fig)

@workflow
def demo_saving():
    sine_plot()
Since we deactivated the folder store, let's activate it again now:
folder_store.activate()
And run our workflow:
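# As before, create the workflow and run it:
wf = demo_saving()
result = wf.run()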
You can see in the logs that an artifact was created:
Artifact: 'Sine Plot' of type 'Figure' logged
Now let's load the image from disk.
First we need to find the logbook folder created for our workflow:
demo_saving_folders = sorted(store_folder.glob("*/*-demo-saving"))
demo_saving_folder = demo_saving_folders[-1]
demo_saving_folder
PosixPath('experiment_store/20241219/20241219T171142-demo-saving')
And let's list its contents:
sorted(demo_saving_folder.iterdir())
[PosixPath('experiment_store/20241219/20241219T171142-demo-saving/Sine Plot.png'), PosixPath('experiment_store/20241219/20241219T171142-demo-saving/demo-saving.input.options.json'), PosixPath('experiment_store/20241219/20241219T171142-demo-saving/log.jsonl')]
And finally let's load the saved image using PIL:
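A minimal cell to do this, using the file name from the listing above:
import PIL.Image

PIL.Image.open(demo_saving_folder / "Sine Plot.png")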
Saving an object also generates an entry in the folder store log.
We can view it by opening the log:
experiment_log = demo_saving_folder / "log.jsonl"
logs = [json.loads(line) for line in experiment_log.read_text().splitlines()]
logs
[{'event': 'start', 'workflow': 'demo_saving', 'index': [], 'time': '2024-12-19 17:11:42.828459+00:00', 'input': {'options': [{'filename': 'demo-saving.input.options.json'}]}}, {'event': 'task_start', 'task': 'sine_plot', 'index': [], 'time': '2024-12-19 17:11:42.829302+00:00', 'input': {}}, {'event': 'artifact', 'time': '2024-12-19 17:11:42.835803+00:00', 'artifact_name': 'Sine Plot', 'artifact_type': 'Figure', 'artifact_metadata': {}, 'artifact_options': {}, 'artifact_files': [{'filename': 'Sine Plot.png'}]}, {'event': 'task_end', 'task': 'sine_plot', 'index': [], 'time': '2024-12-19 17:11:42.923440+00:00', 'output': None}, {'event': 'end', 'workflow': 'demo_saving', 'index': [], 'time': '2024-12-19 17:11:42.923516+00:00', 'output': None}]
As you can see above, the log records the name (artifact_name) and type (artifact_type) of the object saved, and the name of the file it was written to (artifact_files). Saving an artifact may write multiple files to disk.

The artifact_metadata contains additional user-supplied information about the object saved, while artifact_options provides initial information on how to save the object. For example, we could have elected to save the figure in another file format. We'll see how to use both next.
Specifying metadata and options when saving
Let's again make a small workflow that saves a plot, but this time we'll add some options and metadata.
@task
def sine_plot_with_options():
    fig = plt.figure()
    plt.title("A sine wave")
    x = np.linspace(0, 2 * np.pi, 100)
    y = np.sin(x)
    plt.plot(x, y)
    [ax] = fig.get_axes()
    save_artifact(
        "Sine Plot",
        fig,
        metadata={
            "title": ax.get_title(),
        },
        options={
            "format": "jpg",
        },
    )

@workflow
def demo_saving_with_options():
    sine_plot_with_options()
And run the workflow to save the plot:
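# Same pattern as before:
wf = demo_saving_with_options()
result = wf.run()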
Again we open the workflow folder and load the saved image:
demo_saving_with_options_folders = sorted(
    store_folder.glob("*/*-demo-saving-with-options")
)
demo_saving_with_options_folder = demo_saving_with_options_folders[-1]
demo_saving_with_options_folder
PosixPath('experiment_store/20241219/20241219T171143-demo-saving-with-options')
sorted(demo_saving_with_options_folder.iterdir())
[PosixPath('experiment_store/20241219/20241219T171143-demo-saving-with-options/Sine Plot.jpg'), PosixPath('experiment_store/20241219/20241219T171143-demo-saving-with-options/demo-saving-with-options.input.options.json'), PosixPath('experiment_store/20241219/20241219T171143-demo-saving-with-options/log.jsonl')]
Now when we load the image it is very slightly blurry, because it was saved as a JPEG which uses lossy compression:
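A minimal cell to do this, using the JPEG file name from the listing above:
import PIL.Image

PIL.Image.open(demo_saving_with_options_folder / "Sine Plot.jpg")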
And if we view the logs, we can see that the title was recorded in the artifact_metadata:
experiment_log = demo_saving_with_options_folder / "log.jsonl"
logs = [json.loads(line) for line in experiment_log.read_text().splitlines()]
logs
[{'event': 'start', 'workflow': 'demo_saving_with_options', 'index': [], 'time': '2024-12-19 17:11:43.061585+00:00', 'input': {'options': [{'filename': 'demo-saving-with-options.input.options.json'}]}}, {'event': 'task_start', 'task': 'sine_plot_with_options', 'index': [], 'time': '2024-12-19 17:11:43.062494+00:00', 'input': {}}, {'event': 'artifact', 'time': '2024-12-19 17:11:43.069117+00:00', 'artifact_name': 'Sine Plot', 'artifact_type': 'Figure', 'artifact_metadata': {'title': 'A sine wave'}, 'artifact_options': {'format': 'jpg'}, 'artifact_files': [{'filename': 'Sine Plot.jpg'}]}, {'event': 'task_end', 'task': 'sine_plot_with_options', 'index': [], 'time': '2024-12-19 17:11:43.144413+00:00', 'output': None}, {'event': 'end', 'workflow': 'demo_saving_with_options', 'index': [], 'time': '2024-12-19 17:11:43.144468+00:00', 'output': None}]
The supported options for saving artifacts depend on the type of artifact. For our matplotlib figure example, the options are forwarded to matplotlib.pyplot.savefig and are documented in the Matplotlib documentation, with the following changes to the default values:

- format is set to "png" by default
- bbox_inches is set to "tight" by default

In the same way, the options for a PIL.Image.Image are forwarded to PIL.Image.Image.save and are documented in the Pillow documentation, with the format defaulting to "PNG". For a numpy.ndarray, the options are forwarded to numpy.save and are documented in the NumPy documentation, with allow_pickle set to False by default.
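Since the options are forwarded to matplotlib.pyplot.savefig, any other savefig keyword can be passed the same way. As a sketch (the task name is made up for illustration), here dpi is used to request a higher-resolution PNG:
@task
def sine_plot_high_res():
    fig = plt.figure()
    plt.title("A sine wave")
    x = np.linspace(0, 2 * np.pi, 100)
    plt.plot(x, np.sin(x))
    # options are forwarded to matplotlib.pyplot.savefig, so dpi works here too:
    save_artifact("Sine Plot HiRes", fig, options={"dpi": 300})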
We're done!