Recording Experiment Workflow Results
While running an experiment workflow, one would like to keep a record of what took place: a kind of digital lab book. The LabOne Q Applications Library provides logbooks for just this task.
Each workflow run creates its own logbook. The logbook records the tasks being run and may also be used to store additional data such as device settings, LabOne Q experiments, qubits, and the results of experiments and analyses.
Logbooks need to be stored somewhere, and within the Applications Library, this place is called a logbook store.
Currently the Applications Library supports two kinds of stores:

- The FolderStore writes logbooks to a folder on disk. It is used to keep a permanent record of the experiment workflow.
- The LoggingStore logs what is happening using Python's logging. It provides a quick overview of the steps performed by a workflow.
We'll look at each of these in more detail shortly, but first let us set up a quantum platform to run some experiments on so we have something to record.
Setting up a quantum platform
Build your LabOne Q DeviceSetup, qubits and Session as normal. Here we import a demonstration tunable transmon quantum platform from the library and the amplitude Rabi experiment:
import numpy as np
from laboneq.simple import *
from laboneq_applications.experiments import amplitude_rabi
from laboneq_applications.qpu_types.tunable_transmon import demo_platform
# Create a demonstration QuantumPlatform for a tunable-transmon QPU:
qt_platform = demo_platform(n_qubits=6)
# The platform contains a setup, which is an ordinary LabOne Q DeviceSetup:
setup = qt_platform.setup
# And a tunable-transmon QPU:
qpu = qt_platform.qpu
# Inside the QPU, we have qubits, which is a list of six LabOne Q Application
# Library TunableTransmonQubit qubits:
qubits = qpu.qubits
session = Session(setup)
session.connect(do_emulation=True)
[2024.11.07 16:42:07.892] INFO Logging initialized from [Default inline config in laboneq.laboneq_logging] logdir is /builds/qccs/laboneq-applications/docs/sources/tutorials/sources/laboneq_output/log
[2024.11.07 16:42:07.898] INFO VERSION: laboneq 2.41.0
[2024.11.07 16:42:07.899] INFO Connecting to data server at localhost:8004
[2024.11.07 16:42:07.903] INFO Connected to Zurich Instruments LabOne Data Server version 24.10 at localhost:8004
[2024.11.07 16:42:08.053] INFO Configuring the device setup
[2024.11.07 16:42:08.095] INFO The device setup is configured
<laboneq.dsl.session.ConnectionState at 0x7ef2dd9e1dc0>
The LoggingStore
When you import the laboneq_applications library, it automatically creates a default LoggingStore for you. This logging store is used whenever a workflow is executed and logs information about:
- the start and end of workflows
- the start and end of tasks
- any errors that occur
- comments (ad hoc messages from tasks, more on these later)
- any data files that would be saved if a folder store were in use (more on these later too)
These logs don't save anything on disk, but they will allow us to see what events are recorded and what would be saved if we did have a folder store active.
An example of logging
Let's run the amplitude Rabi experiment and take a look:
amplitudes = np.linspace(0.0, 0.9, 10)
options = amplitude_rabi.experiment_workflow.options()
options.count(10)
options.averaging_mode("cyclic")
rabi_tb = amplitude_rabi.experiment_workflow(
    session,
    qpu,
    qubits[0],
    amplitudes,
    options=options,
)
The workflow has not yet been executed, but when you run the next cell, you should see messages like

──────────────────────────────────────────────────────────────────────────────
Workflow 'amplitude_rabi': execution started
──────────────────────────────────────────────────────────────────────────────

appear in the logs beneath the cell.
result = rabi_tb.run()
[2024.11.07 16:42:08.125] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:08.126] INFO Workflow 'amplitude_rabi': execution started at 2024-11-07 16:42:08.124559Z
[2024.11.07 16:42:08.126] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:08.128] INFO Task 'temporary_modify': started at 2024-11-07 16:42:08.127483Z
[2024.11.07 16:42:08.128] INFO Task 'temporary_modify': ended at 2024-11-07 16:42:08.128488Z
[2024.11.07 16:42:08.129] INFO Task 'create_experiment': started at 2024-11-07 16:42:08.129386Z
[2024.11.07 16:42:08.133] INFO Task 'create_experiment': ended at 2024-11-07 16:42:08.133500Z
[2024.11.07 16:42:08.134] INFO Task 'compile_experiment': started at 2024-11-07 16:42:08.134509Z
[2024.11.07 16:42:08.147] INFO Resolved modulation type of oscillator 'q0_readout_acquire_osc' on signal '/logical_signal_groups/q0/acquire' to SOFTWARE
[2024.11.07 16:42:08.148] INFO Resolved modulation type of oscillator 'q0_drive_ge_osc' on signal '/logical_signal_groups/q0/drive' to HARDWARE
[2024.11.07 16:42:08.149] INFO Resolved modulation type of oscillator 'q0_drive_ef_osc' on signal '/logical_signal_groups/q0/drive_ef' to HARDWARE
[2024.11.07 16:42:08.150] INFO Starting LabOne Q Compiler run...
[2024.11.07 16:42:08.165] INFO Schedule completed. [0.011 s]
[2024.11.07 16:42:08.211] INFO Code generation completed for all AWGs. [0.045 s]
[2024.11.07 16:42:08.211] INFO Completed compilation step 1 of 1. [0.057 s]
[2024.11.07 16:42:08.218] INFO ─────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:08.219] INFO Device AWG SeqC LOC CT entries Waveforms Samples
[2024.11.07 16:42:08.220] INFO ─────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:08.220] INFO device_hdawg 0 4 1 0 0
[2024.11.07 16:42:08.220] INFO device_shfqc 0 17 0 1 8000
[2024.11.07 16:42:08.221] INFO device_shfqc_sg 0 35 11 2 448
[2024.11.07 16:42:08.221] INFO ─────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:08.222] INFO TOTAL 56 12 8448
[2024.11.07 16:42:08.222] INFO ─────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:08.230] INFO Finished LabOne Q Compiler run.
[2024.11.07 16:42:08.235] INFO Task 'compile_experiment': ended at 2024-11-07 16:42:08.234742Z
[2024.11.07 16:42:08.236] INFO Task 'run_experiment': started at 2024-11-07 16:42:08.235995Z
[2024.11.07 16:42:08.250] INFO Starting near-time execution...
[2024.11.07 16:42:08.279] INFO Finished near-time execution.
[2024.11.07 16:42:08.282] INFO Task 'run_experiment': ended at 2024-11-07 16:42:08.281652Z
[2024.11.07 16:42:08.283] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:08.284] INFO Workflow 'analysis_workflow': execution started at 2024-11-07
[2024.11.07 16:42:08.285] INFO 16:42:08.282929Z
[2024.11.07 16:42:08.285] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:08.286] INFO Task 'calculate_qubit_population': started at 2024-11-07 16:42:08.286118Z
[2024.11.07 16:42:08.288] INFO Task 'calculate_qubit_population': ended at 2024-11-07 16:42:08.287669Z
[2024.11.07 16:42:08.289] INFO Task 'fit_data': started at 2024-11-07 16:42:08.288702Z
[2024.11.07 16:42:08.907] INFO Task 'fit_data': ended at 2024-11-07 16:42:08.907071Z
[2024.11.07 16:42:08.909] INFO Task 'extract_qubit_parameters': started at 2024-11-07 16:42:08.908708Z
[2024.11.07 16:42:08.912] ERROR Could not extract pi- and pi/2-pulse amplitudes for q0.
[2024.11.07 16:42:08.912] ERROR Could not extract pi- and pi/2-pulse amplitudes for q0.
[2024.11.07 16:42:08.914] INFO Task 'extract_qubit_parameters': ended at 2024-11-07 16:42:08.914027Z
[2024.11.07 16:42:08.916] INFO Task 'plot_raw_complex_data_1d': started at 2024-11-07 16:42:08.915635Z
[2024.11.07 16:42:08.965] INFO Artifact: 'Raw_data_q0' of type 'Figure' logged at 2024-11-07 16:42:08.964957Z
[2024.11.07 16:42:08.967] INFO Task 'plot_raw_complex_data_1d': ended at 2024-11-07 16:42:08.966655Z
[2024.11.07 16:42:08.968] INFO Task 'plot_population': started at 2024-11-07 16:42:08.968053Z
[2024.11.07 16:42:08.990] INFO Artifact: 'Rabi_q0' of type 'Figure' logged at 2024-11-07 16:42:08.990115Z
[2024.11.07 16:42:08.992] INFO Task 'plot_population': ended at 2024-11-07 16:42:08.992159Z
[2024.11.07 16:42:08.993] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:08.994] INFO Workflow 'analysis_workflow': execution ended at 2024-11-07 16:42:08.993320Z
[2024.11.07 16:42:08.994] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:08.996] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:08.996] INFO Workflow 'amplitude_rabi': execution ended at 2024-11-07 16:42:08.995645Z
[2024.11.07 16:42:08.997] INFO ──────────────────────────────────────────────────────────────────────────────
And that's all there is to the basic logging functionality.
Advanced logging uses
If you need to create a logging store of your own, you can do so as follows:
from laboneq.workflow.logbook import LoggingStore
logging_store = LoggingStore()
The logging store created above won't be active unless you run:
logging_store.activate()
And you deactivate it with:
logging_store.deactivate()
You can access the default logging store by importing it from laboneq.workflow.logbook:
from laboneq.workflow.logbook import DEFAULT_LOGGING_STORE
DEFAULT_LOGGING_STORE
<laboneq.workflow.logbook.logging_store.LoggingStore at 0x7ef37eb82f30>
You can also inspect all the active logbook stores:
from laboneq.workflow.logbook import active_logbook_stores
active_logbook_stores()
[<laboneq.workflow.logbook.logging_store.LoggingStore at 0x7ef37eb82f30>]
The FolderStore
Using the folder store
The FolderStore saves workflow results on disk and is likely the most important logbook store you'll use. You can import it as follows:
from laboneq.workflow.logbook import FolderStore
To create a folder store you'll need to pick a folder on disk to store logbooks in. Here we select ./experiment_store as the folder name, but you should pick your own.
Each logbook created by a workflow will have its own sub-folder. The sub-folder name will start with a timestamp, followed by the name of the workflow, for example 20240728T175500-amplitude-rabi/. If necessary, a unique count will be added at the end to make the sub-folder name unique.
The timestamps are in UTC, so they might be offset from your local time, but will be meaningful to users in other timezones and will remain correctly ordered when changing to or from daylight saving time.
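As a small illustration (a sketch only; the helper below is not part of the library), a sub-folder name following this scheme can be split back into its timestamp and workflow name like this:
from datetime import datetime

def parse_logbook_folder_name(name):
    # Illustrative helper: split e.g. "20240728T175500-amplitude-rabi" into
    # a timestamp and a workflow name. Any trailing unique count simply
    # remains part of the returned workflow name.
    timestamp_str, _, workflow_name = name.partition("-")
    return datetime.strptime(timestamp_str, "%Y%m%dT%H%M%S"), workflow_name

parse_logbook_folder_name("20240728T175500-amplitude-rabi")
# (datetime.datetime(2024, 7, 28, 17, 55), 'amplitude-rabi')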
The folder store will need to be activated before workflows will use it automatically.
folder_store = FolderStore("./experiment_store")
folder_store.activate()
Now let's run the amplitude Rabi workflow. As before we'll see the task events being logged. Afterwards we'll explore the folder to see what has been written to disk.
result = rabi_tb.run()
[2024.11.07 16:42:09.052] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:09.053] INFO Workflow 'amplitude_rabi': execution started at 2024-11-07 16:42:09.051666Z
[2024.11.07 16:42:09.053] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:09.127] INFO Task 'temporary_modify': started at 2024-11-07 16:42:09.127104Z
[2024.11.07 16:42:09.129] INFO Task 'temporary_modify': ended at 2024-11-07 16:42:09.128967Z
[2024.11.07 16:42:09.130] INFO Task 'create_experiment': started at 2024-11-07 16:42:09.130251Z
[2024.11.07 16:42:09.145] INFO Task 'create_experiment': ended at 2024-11-07 16:42:09.145195Z
[2024.11.07 16:42:09.158] INFO Task 'compile_experiment': started at 2024-11-07 16:42:09.157909Z
[2024.11.07 16:42:09.169] INFO Resolved modulation type of oscillator 'q0_readout_acquire_osc' on signal '/logical_signal_groups/q0/acquire' to SOFTWARE
[2024.11.07 16:42:09.170] INFO Resolved modulation type of oscillator 'q0_drive_ge_osc' on signal '/logical_signal_groups/q0/drive' to HARDWARE
[2024.11.07 16:42:09.170] INFO Resolved modulation type of oscillator 'q0_drive_ef_osc' on signal '/logical_signal_groups/q0/drive_ef' to HARDWARE
[2024.11.07 16:42:09.171] INFO Starting LabOne Q Compiler run...
[2024.11.07 16:42:09.182] INFO Schedule completed. [0.008 s]
[2024.11.07 16:42:09.217] INFO Code generation completed for all AWGs. [0.034 s]
[2024.11.07 16:42:09.218] INFO Completed compilation step 1 of 1. [0.044 s]
[2024.11.07 16:42:09.223] INFO ─────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:09.225] INFO Device AWG SeqC LOC CT entries Waveforms Samples
[2024.11.07 16:42:09.225] INFO ─────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:09.226] INFO device_hdawg 0 4 1 0 0
[2024.11.07 16:42:09.226] INFO device_shfqc 0 17 0 1 8000
[2024.11.07 16:42:09.226] INFO device_shfqc_sg 0 35 11 2 448
[2024.11.07 16:42:09.227] INFO ─────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:09.228] INFO TOTAL 56 12 8448
[2024.11.07 16:42:09.228] INFO ─────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:09.234] INFO Finished LabOne Q Compiler run.
[2024.11.07 16:42:09.238] INFO Task 'compile_experiment': ended at 2024-11-07 16:42:09.238144Z
[2024.11.07 16:42:09.262] INFO Task 'run_experiment': started at 2024-11-07 16:42:09.262455Z
[2024.11.07 16:42:09.272] INFO Starting near-time execution...
[2024.11.07 16:42:09.297] INFO Finished near-time execution.
[2024.11.07 16:42:09.299] INFO Task 'run_experiment': ended at 2024-11-07 16:42:09.298624Z
[2024.11.07 16:42:09.301] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:09.302] INFO Workflow 'analysis_workflow': execution started at 2024-11-07
[2024.11.07 16:42:09.303] INFO 16:42:09.301430Z
[2024.11.07 16:42:09.303] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:09.351] INFO Task 'calculate_qubit_population': started at 2024-11-07 16:42:09.350890Z
[2024.11.07 16:42:09.365] INFO Task 'calculate_qubit_population': ended at 2024-11-07 16:42:09.364718Z
[2024.11.07 16:42:09.366] INFO Task 'fit_data': started at 2024-11-07 16:42:09.366627Z
[2024.11.07 16:42:09.992] INFO Task 'fit_data': ended at 2024-11-07 16:42:09.992247Z
[2024.11.07 16:42:09.994] INFO Task 'extract_qubit_parameters': started at 2024-11-07 16:42:09.994471Z
[2024.11.07 16:42:10.014] ERROR Could not extract pi- and pi/2-pulse amplitudes for q0.
[2024.11.07 16:42:10.014] ERROR Could not extract pi- and pi/2-pulse amplitudes for q0.
[2024.11.07 16:42:10.016] INFO Task 'extract_qubit_parameters': ended at 2024-11-07 16:42:10.016081Z
[2024.11.07 16:42:10.017] INFO Task 'plot_raw_complex_data_1d': started at 2024-11-07 16:42:10.017596Z
[2024.11.07 16:42:10.061] INFO Artifact: 'Raw_data_q0' of type 'Figure' logged at 2024-11-07 16:42:10.060375Z
[2024.11.07 16:42:10.362] INFO Task 'plot_raw_complex_data_1d': ended at 2024-11-07 16:42:10.362310Z
[2024.11.07 16:42:10.364] INFO Task 'plot_population': started at 2024-11-07 16:42:10.364037Z
[2024.11.07 16:42:10.386] INFO Artifact: 'Rabi_q0' of type 'Figure' logged at 2024-11-07 16:42:10.385870Z
[2024.11.07 16:42:10.490] INFO Task 'plot_population': ended at 2024-11-07 16:42:10.489625Z
[2024.11.07 16:42:10.492] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:10.492] INFO Workflow 'analysis_workflow': execution ended at 2024-11-07 16:42:10.491641Z
[2024.11.07 16:42:10.493] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:10.494] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:10.495] INFO Workflow 'amplitude_rabi': execution ended at 2024-11-07 16:42:10.494102Z
[2024.11.07 16:42:10.495] INFO ──────────────────────────────────────────────────────────────────────────────
If you no longer wish to automatically store workflow results in the folder store, you can deactivate it with:
folder_store.deactivate()
Exploring what was written to disk
Here we will use Python's pathlib functionality to explore what has been written to disk, but you can also use whatever ordinary tools you prefer (terminal, file navigator).
import json
from pathlib import Path
Remember that above we requested that the folder store use a folder named experiment_store. Let's list the logbooks that were created in that folder:
store_folder = Path("experiment_store")
amplitude_rabi_folders = sorted(store_folder.glob("*/*-amplitude-rabi"))
Our amplitude Rabi experiment is the most recent one run, so let's look at the files within the most recent folder. Note that the logbook folder names start with a timestamp followed by the name of the workflow, which allows us to easily order them by time and to find the workflow we're looking for:
amplitude_rabi_folder = amplitude_rabi_folders[-1]
amplitude_rabi_files = sorted(amplitude_rabi_folder.iterdir())
amplitude_rabi_files
[PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/Rabi-q0.png'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/Raw-data-q0.png'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/amplitude-rabi.input.amplitudes.npy'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/amplitude-rabi.input.options.json'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/amplitude-rabi.input.qpu.json'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/amplitude-rabi.input.qubits.json'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/analysis-workflow.input.options.json'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/calculate-qubit-population.input.options.json'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/compile-experiment.output.json'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/create-experiment.input.options.json'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/create-experiment.output.json'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/extract-qubit-parameters.input.options.json'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/fit-data.input.options.json'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/log.jsonl'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/plot-population.input.options.json'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/plot-raw-complex-data-1d.input.options.json'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/run-experiment.input.options.json'), PosixPath('experiment_store/20241107/20241107T164209-amplitude-rabi/run-experiment.output.json')]
Let us look at the file log.jsonl. This is the log of what took place. The log is stored in a format called "JSONL", which means that each line of the log is a JSON object (read back in Python as a dictionary). Larger objects and certain types of data are stored as separate files, either in a subfolder called obj or, for some important data, in the same folder.
Let's open the file and list the logs:
experiment_log = amplitude_rabi_folder / "log.jsonl"
logs = [
    json.loads(line) for line in experiment_log.read_text().splitlines()
]
logs
[{'event': 'start', 'workflow': 'amplitude_rabi', 'time': '2024-11-07 16:42:09.051666+00:00', 'input': {'session': '...', 'qpu': [{'filename': 'amplitude-rabi.input.qpu.json'}], 'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'amplitudes': [{'filename': 'amplitude-rabi.input.amplitudes.npy'}], 'temporary_parameters': None, 'options': [{'filename': 'amplitude-rabi.input.options.json'}]}}, {'event': 'task_start', 'task': 'temporary_modify', 'time': '2024-11-07 16:42:09.127104+00:00', 'input': {'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'temporary_parameters': None}}, {'event': 'task_end', 'task': 'temporary_modify', 'time': '2024-11-07 16:42:09.128967+00:00', 'output': [{'filename': 'amplitude-rabi.input.qubits.json'}]}, {'event': 'task_start', 'task': 'create_experiment', 'time': '2024-11-07 16:42:09.130251+00:00', 'input': {'qpu': [{'filename': 'amplitude-rabi.input.qpu.json'}], 'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'amplitudes': [{'filename': 'amplitude-rabi.input.amplitudes.npy'}], 'options': [{'filename': 'create-experiment.input.options.json'}]}}, {'event': 'task_end', 'task': 'create_experiment', 'time': '2024-11-07 16:42:09.145195+00:00', 'output': [{'filename': 'create-experiment.output.json'}]}, {'event': 'task_start', 'task': 'compile_experiment', 'time': '2024-11-07 16:42:09.157909+00:00', 'input': {'session': '...', 'experiment': [{'filename': 'create-experiment.output.json'}], 'compiler_settings': None}}, {'event': 'task_end', 'task': 'compile_experiment', 'time': '2024-11-07 16:42:09.238144+00:00', 'output': [{'filename': 'compile-experiment.output.json'}]}, {'event': 'task_start', 'task': 'run_experiment', 'time': '2024-11-07 16:42:09.262455+00:00', 'input': {'session': '...', 'compiled_experiment': [{'filename': 'compile-experiment.output.json'}], 'options': [{'filename': 'run-experiment.input.options.json'}]}}, {'event': 'task_end', 'task': 'run_experiment', 'time': '2024-11-07 16:42:09.298624+00:00', 'output': [{'filename': 'run-experiment.output.json'}]}, {'event': 'start', 'workflow': 'analysis_workflow', 'time': '2024-11-07 16:42:09.301430+00:00', 'input': {'result': [{'filename': 'run-experiment.output.json'}], 'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'amplitudes': [{'filename': 'amplitude-rabi.input.amplitudes.npy'}], 'options': [{'filename': 'analysis-workflow.input.options.json'}]}}, {'event': 'task_start', 'task': 'calculate_qubit_population', 'time': '2024-11-07 16:42:09.350890+00:00', 'input': {'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'result': [{'filename': 'run-experiment.output.json'}], 'sweep_points': [{'filename': 'amplitude-rabi.input.amplitudes.npy'}], 'options': [{'filename': 'calculate-qubit-population.input.options.json'}]}}, {'event': 'task_end', 'task': 'calculate_qubit_population', 'time': '2024-11-07 16:42:09.364718+00:00', 'output': {'q0': '...'}}, {'event': 'task_start', 'task': 'fit_data', 'time': '2024-11-07 16:42:09.366627+00:00', 'input': {'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'processed_data_dict': '...', 'options': [{'filename': 'fit-data.input.options.json'}]}}, {'event': 'task_end', 'task': 'fit_data', 'time': '2024-11-07 16:42:09.992247+00:00', 'output': {'q0': '...'}}, {'event': 'task_start', 'task': 'extract_qubit_parameters', 'time': '2024-11-07 16:42:09.994471+00:00', 'input': {'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'processed_data_dict': '...', 'fit_results': '...', 'options': [{'filename': 
'extract-qubit-parameters.input.options.json'}]}}, {'event': 'log', 'message': 'Could not extract pi- and pi/2-pulse amplitudes for q0.', 'time': '2024-11-07 16:42:10.015625+00:00', 'level': 40}, {'event': 'task_end', 'task': 'extract_qubit_parameters', 'time': '2024-11-07 16:42:10.016081+00:00', 'output': {'old_parameter_values': {'q0': {'ge_drive_amplitude_pi': 0.8, 'ge_drive_amplitude_pi2': 0.4}}, 'new_parameter_values': {'q0': {}}}}, {'event': 'task_start', 'task': 'plot_raw_complex_data_1d', 'time': '2024-11-07 16:42:10.017596+00:00', 'input': {'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'result': [{'filename': 'run-experiment.output.json'}], 'sweep_points': [{'filename': 'amplitude-rabi.input.amplitudes.npy'}], 'xlabel': 'Amplitude Scaling', 'xscaling': 1.0, 'options': [{'filename': 'plot-raw-complex-data-1d.input.options.json'}]}}, {'event': 'artifact', 'time': '2024-11-07 16:42:10.060375+00:00', 'artifact_name': 'Raw_data_q0', 'artifact_type': 'Figure', 'artifact_metadata': {}, 'artifact_options': {}, 'artifact_files': [{'filename': 'Raw-data-q0.png'}]}, {'event': 'task_end', 'task': 'plot_raw_complex_data_1d', 'time': '2024-11-07 16:42:10.362310+00:00', 'output': {'q0': [{'filename': 'Raw-data-q0.png'}]}}, {'event': 'task_start', 'task': 'plot_population', 'time': '2024-11-07 16:42:10.364037+00:00', 'input': {'qubits': [{'filename': 'amplitude-rabi.input.qubits.json'}], 'processed_data_dict': '...', 'fit_results': '...', 'qubit_parameters': {'old_parameter_values': {'q0': {'ge_drive_amplitude_pi': 0.8, 'ge_drive_amplitude_pi2': 0.4}}, 'new_parameter_values': {'q0': {}}}, 'options': [{'filename': 'plot-population.input.options.json'}]}}, {'event': 'artifact', 'time': '2024-11-07 16:42:10.385870+00:00', 'artifact_name': 'Rabi_q0', 'artifact_type': 'Figure', 'artifact_metadata': {}, 'artifact_options': {}, 'artifact_files': [{'filename': 'Rabi-q0.png'}]}, {'event': 'task_end', 'task': 'plot_population', 'time': '2024-11-07 16:42:10.489625+00:00', 'output': {'q0': [{'filename': 'Rabi-q0.png'}]}}, {'event': 'end', 'workflow': 'analysis_workflow', 'time': '2024-11-07 16:42:10.491641+00:00', 'output': {'old_parameter_values': {'q0': {'ge_drive_amplitude_pi': 0.8, 'ge_drive_amplitude_pi2': 0.4}}, 'new_parameter_values': {'q0': {}}}}, {'event': 'end', 'workflow': 'amplitude_rabi', 'time': '2024-11-07 16:42:10.494102+00:00', 'output': [{'filename': 'run-experiment.output.json'}]}]
In the remaining sections we'll look at how to write ad hoc comments into the logs and how to save data files to disk.
The timestamp of the start of the workflow execution and the name(s) of the currently executing workflow(s) (if the task was executed from a workflow) can be obtained from within a task. If the task was not called from within a workflow execution context, the timestamp will be None and the list of workflow names will be empty. When a folder logger is used, the timestamp and the first of the workflow names also form part of the folder path. Here is an example of a task which reads the outermost workflow's name and the timestamp:
from laboneq.workflow import (
    execution_info,
    task,
    workflow,
)

@task
def folder_logger_timestamp_and_workflow_name():
    info = execution_info()  # Returns a WorkflowExecutionInfoView object
    return (info.workflows[0], info.start_time)

@workflow
def timestamp_and_name_workflow():
    folder_logger_timestamp_and_workflow_name()
wf = timestamp_and_name_workflow()
result = wf.run()
print(result.tasks["folder_logger_timestamp_and_workflow_name"].output)
[2024.11.07 16:42:10.537] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:10.538] INFO Workflow 'timestamp_and_name_workflow': execution started at 2024-11-07
[2024.11.07 16:42:10.539] INFO 16:42:10.536506Z
[2024.11.07 16:42:10.540] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:10.541] INFO Task 'folder_logger_timestamp_and_workflow_name': started at 2024-11-07
[2024.11.07 16:42:10.542] INFO 16:42:10.540961Z
[2024.11.07 16:42:10.543] INFO Task 'folder_logger_timestamp_and_workflow_name': ended at 2024-11-07
[2024.11.07 16:42:10.543] INFO 16:42:10.542589Z
[2024.11.07 16:42:10.544] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:10.545] INFO Workflow 'timestamp_and_name_workflow': execution ended at 2024-11-07
[2024.11.07 16:42:10.545] INFO 16:42:10.544108Z
[2024.11.07 16:42:10.546] INFO ──────────────────────────────────────────────────────────────────────────────
('timestamp_and_name_workflow', datetime.datetime(2024, 11, 7, 16, 42, 10, 536027, tzinfo=datetime.timezone.utc))
The output of WorkflowExecutionInfoView.workflows is a list, where the outermost workflow is the first element and the innermost workflow is the last element. The output of WorkflowExecutionInfoView.start_time is a datetime.datetime object, which is used for creating the folder logger's data folder in the format YYYYMMDDTHHMMSS (using strftime("%Y%m%dT%H%M%S")) after conversion from UTC to local time.
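As a rough sketch of that conversion (illustrative only; start_time stands in for the value returned by execution_info() above):
from datetime import datetime, timezone

# A UTC start time, as returned by WorkflowExecutionInfoView.start_time:
start_time = datetime(2024, 11, 7, 16, 42, 10, 536027, tzinfo=timezone.utc)

# Convert to local time and format it the way the folder name is formed:
start_time.astimezone().strftime("%Y%m%dT%H%M%S")
# e.g. '20241107T174210' in a UTC+1 timezone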
Logging comments from within tasks
Logbooks allow tasks to add their own messages to the logbook as comments.
This is done by calling the comment(...) function within a task.
We'll work through an example below:
from laboneq.workflow import comment, task, workflow
Let's write a small workflow and a tiny task that just writes a comment to the logbook:
@task
def log_a_comment(msg):
    comment(msg)

@workflow
def demo_comments():
    log_a_comment("Activating multi-state discrimination! <sirens blare>")
    log_a_comment("Analysis successful! <cheers>")
Now when we run the workflow we'll see the comments appear in the logs:
wf = demo_comments()
result = wf.run()
[2024.11.07 16:42:10.569] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:10.570] INFO Workflow 'demo_comments': execution started at 2024-11-07 16:42:10.566809Z
[2024.11.07 16:42:10.571] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:10.572] INFO Task 'log_a_comment': started at 2024-11-07 16:42:10.571936Z
[2024.11.07 16:42:10.574] INFO Comment: Activating multi-state discrimination! <sirens blare>
[2024.11.07 16:42:10.575] INFO Task 'log_a_comment': ended at 2024-11-07 16:42:10.575000Z
[2024.11.07 16:42:10.576] INFO Task 'log_a_comment': started at 2024-11-07 16:42:10.576122Z
[2024.11.07 16:42:10.577] INFO Comment: Analysis successful! <cheers>
[2024.11.07 16:42:10.578] INFO Task 'log_a_comment': ended at 2024-11-07 16:42:10.577666Z
[2024.11.07 16:42:10.579] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:10.579] INFO Workflow 'demo_comments': execution ended at 2024-11-07 16:42:10.578655Z
[2024.11.07 16:42:10.579] INFO ──────────────────────────────────────────────────────────────────────────────
Above you should see the two comments. They look like this:
Comment: Activating multi-state discrimination! <sirens blare>
...
Comment: Analysis successful! <cheers>
In addition to comment(...), the logbook supports a function log(level: int, message: str, *args: object), which logs a message at the specified logging level, similar to Python's logging module. This additional function is useful for messages that are not regular user comments but still let tasks give feedback about issues that are important to record.
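For example, a task might record a warning like this (a sketch; we assume here that log can be imported from laboneq.workflow alongside comment, and the task and threshold are purely illustrative):
import logging

from laboneq.workflow import log, task

@task
def check_signal_level(level_dbm):
    # Record a warning in the logbook if the level looks suspicious.
    # Illustrative only: the threshold and message are not part of the library.
    if level_dbm > 0:
        log(logging.WARNING, "Unexpectedly high signal level: %s dBm", level_dbm)
    return level_dbm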
Store data from within tasks
Logbooks also allow files to be saved to disk using the function save_artifact.
Here we will create a figure with matplotlib and save it to disk. The folder store will automatically save it as a PNG.
The kinds of objects the folder store can currently save are:
- Python strings (saved as a text file)
- Python bytes (saved as raw data)
- Pydantic models (saved as JSON)
- PIL images (saved as PNGs by default)
- Matplotlib figures (saved as PNGs by default)
- Numpy arrays (saved as Numpy data files)
Support for more kinds of objects is coming soon (e.g. DeviceSetup, Experiment).
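For instance, a task could store a numpy array or a plain string in the same way (a sketch using the types listed above; the task and artifact names are illustrative):
import numpy as np

from laboneq.workflow import save_artifact, task, workflow

@task
def save_measurement_notes():
    # A numpy array is written as a .npy file by the folder store:
    save_artifact("Calibration amplitudes", np.linspace(0.0, 0.9, 10))
    # A plain string is written as a text file:
    save_artifact("Notes", "Amplitude sweep recorded during cooldown 7.")

@workflow
def demo_saving_other_types():
    save_measurement_notes()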
import PIL
from laboneq.workflow import save_artifact
from matplotlib import pyplot as plt
Let's write a small workflow that plots the sine function and saves the plot using save_artifact:
@task
def sine_plot():
    fig = plt.figure()
    plt.title("A sine wave")
    x = np.linspace(0, 2 * np.pi, 100)
    y = np.sin(x)
    plt.plot(x, y)
    save_artifact("Sine Plot", fig)

@workflow
def demo_saving():
    sine_plot()
Since we deactivated the folder store, let's activate it again now:
folder_store.activate()
And run our workflow:
wf = demo_saving()
result = wf.run()
[2024.11.07 16:42:10.601] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:10.601] INFO Workflow 'demo_saving': execution started at 2024-11-07 16:42:10.600668Z
[2024.11.07 16:42:10.602] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:10.609] INFO Task 'sine_plot': started at 2024-11-07 16:42:10.608839Z
[2024.11.07 16:42:10.625] INFO Artifact: 'Sine Plot' of type 'Figure' logged at 2024-11-07 16:42:10.624881Z
[2024.11.07 16:42:10.738] INFO Task 'sine_plot': ended at 2024-11-07 16:42:10.737931Z
[2024.11.07 16:42:10.740] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:10.741] INFO Workflow 'demo_saving': execution ended at 2024-11-07 16:42:10.739906Z
[2024.11.07 16:42:10.743] INFO ──────────────────────────────────────────────────────────────────────────────
You can see in the logs that an artifact was created:
Artifact: 'Sine Plot' of type 'Figure' logged
Now let's load the image from disk.
First we need to find the logbook folder created for our workflow:
demo_saving_folders = sorted(store_folder.glob("*/*-demo-saving"))
demo_saving_folder = demo_saving_folders[-1]
demo_saving_folder
PosixPath('experiment_store/20241107/20241107T164210-demo-saving')
And let's list its contents:
sorted(demo_saving_folder.iterdir())
[PosixPath('experiment_store/20241107/20241107T164210-demo-saving/Sine Plot.png'), PosixPath('experiment_store/20241107/20241107T164210-demo-saving/demo-saving.input.options.json'), PosixPath('experiment_store/20241107/20241107T164210-demo-saving/log.jsonl')]
And finally let's load the saved image using PIL:
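A minimal sketch of that loading step (the displayed image itself is not reproduced here):
from PIL import Image

sine_plot_image = Image.open(demo_saving_folder / "Sine Plot.png")
sine_plot_image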
Saving an object also generates an entry in the folder store log.
We can view it by opening the log:
experiment_log = demo_saving_folder / "log.jsonl"
logs = [
    json.loads(line) for line in experiment_log.read_text().splitlines()
]
logs
[{'event': 'start', 'workflow': 'demo_saving', 'time': '2024-11-07 16:42:10.600668+00:00', 'input': {'options': [{'filename': 'demo-saving.input.options.json'}]}}, {'event': 'task_start', 'task': 'sine_plot', 'time': '2024-11-07 16:42:10.608839+00:00', 'input': {}}, {'event': 'artifact', 'time': '2024-11-07 16:42:10.624881+00:00', 'artifact_name': 'Sine Plot', 'artifact_type': 'Figure', 'artifact_metadata': {}, 'artifact_options': {}, 'artifact_files': [{'filename': 'Sine Plot.png'}]}, {'event': 'task_end', 'task': 'sine_plot', 'time': '2024-11-07 16:42:10.737931+00:00', 'output': None}, {'event': 'end', 'workflow': 'demo_saving', 'time': '2024-11-07 16:42:10.739906+00:00', 'output': None}]
As you can see above, the log records the name (artifact_name) and type (artifact_type) of the object saved, and the name of the file it was written to (artifact_files). Saving an artifact may write multiple files to disk.
The artifact_metadata contains additional user-supplied information about the object saved, while artifact_options control how the object is saved. For example, we could have chosen to save the figure in another file format. We'll see how to use both next.
Specifying metadata and options when saving
Let's again make a small workflow that saves a plot, but this time we'll add some options and metadata.
@task
def sine_plot_with_options():
    fig = plt.figure()
    plt.title("A sine wave")
    x = np.linspace(0, 2 * np.pi, 100)
    y = np.sin(x)
    plt.plot(x, y)
    [ax] = fig.get_axes()

    save_artifact(
        "Sine Plot",
        fig,
        metadata={
            "title": ax.get_title(),
        },
        options={
            "format": "jpg",
        },
    )

@workflow
def demo_saving_with_options():
    sine_plot_with_options()
And run the workflow to save the plot:
wf = demo_saving_with_options()
result = wf.run()
[2024.11.07 16:42:10.877] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:10.878] INFO Workflow 'demo_saving_with_options': execution started at 2024-11-07
[2024.11.07 16:42:10.878] INFO 16:42:10.877268Z
[2024.11.07 16:42:10.879] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:10.881] INFO Task 'sine_plot_with_options': started at 2024-11-07 16:42:10.880758Z
[2024.11.07 16:42:10.889] INFO Artifact: 'Sine Plot' of type 'Figure' logged at 2024-11-07 16:42:10.889125Z
[2024.11.07 16:42:10.957] INFO Task 'sine_plot_with_options': ended at 2024-11-07 16:42:10.957361Z
[2024.11.07 16:42:10.959] INFO ──────────────────────────────────────────────────────────────────────────────
[2024.11.07 16:42:10.959] INFO Workflow 'demo_saving_with_options': execution ended at 2024-11-07
[2024.11.07 16:42:10.960] INFO 16:42:10.958986Z
[2024.11.07 16:42:10.960] INFO ──────────────────────────────────────────────────────────────────────────────
Again we open the workflow folder and load the saved image:
demo_saving_with_options_folders = sorted(
    store_folder.glob("*/*-demo-saving-with-options")
)
demo_saving_with_options_folder = demo_saving_with_options_folders[-1]
demo_saving_with_options_folder
PosixPath('experiment_store/20241107/20241107T164210-demo-saving-with-options')
sorted(demo_saving_with_options_folder.iterdir())
[PosixPath('experiment_store/20241107/20241107T164210-demo-saving-with-options/Sine Plot.jpg'), PosixPath('experiment_store/20241107/20241107T164210-demo-saving-with-options/demo-saving-with-options.input.options.json'), PosixPath('experiment_store/20241107/20241107T164210-demo-saving-with-options/log.jsonl')]
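Let's load the saved JPEG with PIL (a minimal sketch, analogous to the PNG example above):
from PIL import Image

sine_plot_jpeg = Image.open(demo_saving_with_options_folder / "Sine Plot.jpg")
sine_plot_jpeg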
Now when we load the image, it is very slightly blurry, because it was saved as a JPEG, which uses lossy compression.
And if we view the logs, we can see that the title was recorded in the artifact_metadata:
experiment_log = demo_saving_with_options_folder / "log.jsonl"
logs = [
    json.loads(line) for line in experiment_log.read_text().splitlines()
]
logs
[{'event': 'start', 'workflow': 'demo_saving_with_options', 'time': '2024-11-07 16:42:10.877268+00:00', 'input': {'options': [{'filename': 'demo-saving-with-options.input.options.json'}]}}, {'event': 'task_start', 'task': 'sine_plot_with_options', 'time': '2024-11-07 16:42:10.880758+00:00', 'input': {}}, {'event': 'artifact', 'time': '2024-11-07 16:42:10.889125+00:00', 'artifact_name': 'Sine Plot', 'artifact_type': 'Figure', 'artifact_metadata': {'title': 'A sine wave'}, 'artifact_options': {'format': 'jpg'}, 'artifact_files': [{'filename': 'Sine Plot.jpg'}]}, {'event': 'task_end', 'task': 'sine_plot_with_options', 'time': '2024-11-07 16:42:10.957361+00:00', 'output': None}, {'event': 'end', 'workflow': 'demo_saving_with_options', 'time': '2024-11-07 16:42:10.958986+00:00', 'output': None}]
The supported options for saving artifacts depend on the type of artifact. For our matplotlib figure example, the options are forwarded to matplotlib.pyplot.savefig and are documented in the Matplotlib documentation, with the following changes to the default values:

- format is set to "png" by default
- bbox_inches is set to "tight" by default
In the same way, the options for a PIL.Image.Image are forwarded to PIL.Image.Image.save and are documented in the Pillow documentation, with the format defaulting to "PNG". For a numpy.ndarray, the options are forwarded to numpy.save and are documented in the NumPy documentation, with allow_pickle set to False by default.
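As an illustration (a sketch only, reusing the imports from the cells above; dpi is a standard matplotlib.pyplot.savefig keyword that is not used elsewhere in this tutorial), any of these keyword arguments can be passed through the options dictionary:
@task
def sine_plot_high_res():
    fig = plt.figure()
    plt.title("A sine wave")
    x = np.linspace(0, 2 * np.pi, 100)
    plt.plot(x, np.sin(x))
    # "format" and "dpi" are forwarded to matplotlib.pyplot.savefig:
    save_artifact("Sine Plot high-res", fig, options={"format": "png", "dpi": 300})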
We're done!