ChatGPT v Orba 1

Part 1

Around page 22 of the "Orba hacking knowledge base", a year or so ago, @Subskybox and I were dissecting the eventData string the Orba 1 uses to represent sequences. @Subsky did some clever mathematical analysis while I did the donkey work of setting up experiments and recording the results.


Some of the experiments were based on a song called "DPC" which played the first seven notes of a minor scale. I've attached the song file, console output, and a spreadsheet @Subsky put together after analysing the data.

The eventData string is a mix of note and performance data, but this "DPC" test simplifies things to only include note data. This is organised as a series of "note blocks":

Block  PlayNote  startTicksLSB  startTicksMSB  Note #  Vel On  Vel Off  DurTicksLSB  DurTicksMSB
1      16        7              0              62      120     90       -11          1
2      16        89             7              64      127     92       -17          1
3      16        -105           7              65      113     92       -46          3
4      16        -122           7              67      121     80       -31          3
5      16        108            7              69      118     58       -91          1
6      16        -100           7              70      127     91       -20          1
7      16        113            7              72      87      55       116          1

If you take this series of values and encode it as Base64, you get the following eventData string from the .song file:

"EAcAPnha9QMQWQdAf1zvAxCXB0FxXNIFEIYHQ3lQ4QUQbAdFdjqlAxCcB0Z/W+wBEHEHSFc3dAE="

This appears in the .song XML as follows:

<LoopData writeIndex="56" recordStartTime="0" recordStopTime="11882" lastEventTime="4809"
    nBars="7" eventData="EAcAPnha9QMQWQdAf1zvAxCXB0FxXNIFEIYHQ3lQ4QUQbAdFdjqlAxCcB0Z/W+wBEHEHSFc3dAE="
    eventDataCrc="1ff6d4c4"/>
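
To sanity-check the mapping, here's a minimal decode sketch in Python. It assumes the eventData attribute is nothing more than Base64 over consecutive 8-byte note blocks, with the negative values in the table above being the signed reading of byte values of 128 and over (e.g. -11 is byte 245):

import base64

event_data = ("EAcAPnha9QMQWQdAf1zvAxCXB0FxXNIFEIYHQ3lQ4QUQbAdFdjqlAxCc"
              "B0Z/W+wBEHEHSFc3dAE=")
raw = base64.b64decode(event_data)

# Walk the buffer in 8-byte note blocks, printing each value with the
# signed reading for bytes of 128 and over.
for i in range(0, len(raw), 8):
    block = [b - 256 if b >= 128 else b for b in raw[i:i + 8]]
    print(", ".join(str(v) for v in block))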

The problem we found is that the timing data is relative: when each note plays is affected by the one before it. That makes real-time quantisation a bit of a nightmare. It might be possible to implement "offline" quantisation, processing a .song file to quantise the data, or to create new sequences based on MIDI data, but it's a hassle and we pretty much abandoned the investigation at that point.
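
Converting the stored deltas to absolute times is easy enough offline; it's doing the reverse in real time that hurts. A sketch, assuming each start is an unsigned 16-bit little-endian offset from the previous note's start:

def absolute_start_ticks(blocks):
    """Convert relative note starts to absolute ticks.

    blocks is a list of 8-value note blocks as above; values 1 and 2
    are startTicksLSB/startTicksMSB. Quantising one note shifts every
    note after it unless the next delta is re-adjusted to compensate,
    hence the nightmare.
    """
    ticks, tick = [], 0
    for block in blocks:
        tick += (block[2] & 0xFF) * 256 + (block[1] & 0xFF)
        ticks.append(tick)
    return ticks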
 
A few months later, ChatGPT arrived on the scene...


Corrected video link.

https://youtu.be/OGS634SKncE


Replacing the MIDI notes in an existing sequence in this way is quite straightforward, and you can use the same approach to replace other data, timing, duration, velocity, etc. I'm currently trying to figure out how to generate an eventData string like this from scratch along with other relevant XML data, rather than by manipulating a 'dummy' string.
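
As a rough illustration of that replacement step (my sketch, not the attached script; byte 3 of each 8-byte block is the note number, and the eventDataCrc attribute still has to be recalculated afterwards, which isn't covered here):

import base64

def replace_notes(event_data, new_notes):
    """Overwrite the MIDI note numbers in an eventData string.

    Timing, velocity and duration bytes are left untouched; only byte 3
    of each 8-byte note block (the note number) is replaced.
    """
    raw = bytearray(base64.b64decode(event_data))
    for offset, note in zip(range(3, len(raw), 8), new_notes):
        raw[offset] = note
    return base64.b64encode(bytes(raw)).decode("ascii")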

Here's a new test.

Song "Seq" has eight notes.

The LoopData is:

<LoopData writeIndex="64" recordStartTime="0" recordStopTime="6857" lastEventTime="5902"
    nBars="4" eventData="EAoASmUpqQAQgANKTTCnABDjA0UzJLoAEJIDRTwnxQAQ4gNBbSzXABCyA0FwLK0AEL8DPlc2vAAQxAM+djqzAA=="
    eventDataCrc="e91384a5"/>

stringtoraw.py output:

16,10,0,74,101,41,169,0

16,128,3,74,77,48,167,0

16,227,3,69,51,36,186,0

16,146,3,69,60,39,197,0

16,226,3,65,109,44,215,0

16,178,3,65,112,44,173,0

16,191,3,62,87,54,188,0

16,196,3,62,118,58,179,0

decodeblocks.py output:

MemIdx = 0 - MIDI Note at tick 10, channel 1, note 74, duration 169, von 101, voff 41

MemIdx = 8 - MIDI Note at tick 906, channel 1, note 74, duration 167, von 77, voff 48

MemIdx = 16 - MIDI Note at tick 1901, channel 1, note 69, duration 186, von 51, voff 36

MemIdx = 24 - MIDI Note at tick 2815, channel 1, note 69, duration 197, von 60, voff 39

MemIdx = 32 - MIDI Note at tick 3809, channel 1, note 65, duration 215, von 109, voff 44

MemIdx = 40 - MIDI Note at tick 4755, channel 1, note 65, duration 173, von 112, voff 44

MemIdx = 48 - MIDI Note at tick 5714, channel 1, note 62, duration 188, von 87, voff 54

MemIdx = 56 - MIDI Note at tick 6678, channel 1, note 62, duration 179, von 118, voff 58

decodeblocks.py attempts to simulate the "console data" produced by the Orba's built-in debug utility.
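
For anyone following along, my guess at the core of that simulation looks something like this (the channel is an assumption: the low nibble of the 0x10 tag byte, plus one):

def print_console_lines(rows):
    """Print decodeblocks.py-style lines from 8-value note rows."""
    tick = 0
    for idx, (tag, s_lsb, s_msb, note, von, voff, d_lsb, d_msb) in enumerate(rows):
        tick += s_msb * 256 + s_lsb      # note starts are cumulative deltas
        duration = d_msb * 256 + d_lsb   # durations are plain 16-bit values
        channel = (tag & 0x0F) + 1       # assumption: low nibble of the tag
        print(f"MemIdx = {idx * 8} - MIDI Note at tick {tick}, "
              f"channel {channel}, note {note}, duration {duration}, "
              f"von {von}, voff {voff}")

Feeding it the first two stringtoraw.py rows, (16, 10, 0, 74, 101, 41, 169, 0) and (16, 128, 3, 74, 77, 48, 167, 0), reproduces the first two MemIdx lines above.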

The real "console data" is:

         Using 64 of 4096 bytes of looper memory available (1 %)


         Looper Configuration:

                loopBarEndTolerance: 120

                beatLengthTicks: 480

                notesPerBar: 4

                quantMode: 0

                quantStartSnapTicks: 120

                quantBarEndSnapTicks: 240

                allowRecordingSilenceStart: 1

                allowRecordingSilenceEnd: 0

                thinnerMode: 1

                cc_error_limit: 1000

                pbend_error_limit: 25000

                ccA: 1

                ccB: 74

                ccC: 75

                thickenerMode: 1

                thickenerEmitPrd: 20

                thickenerMaxDt: 5000

                noteStartWindow: 240


MemIdx = 0 - MIDI Note at tick 10, channel 1, note 74, duration 169, von 101, voff 41

MemIdx = 8 - MIDI Note at tick 906, channel 1, note 74, duration 167, von 77, voff 48

MemIdx = 16 - MIDI Note at tick 1901, channel 1, note 69, duration 186, von 51, voff 36

MemIdx = 24 - MIDI Note at tick 2815, channel 1, note 69, duration 197, von 60, voff 39

MemIdx = 32 - MIDI Note at tick 3809, channel 1, note 65, duration 215, von 109, voff 44

MemIdx = 40 - MIDI Note at tick 4755, channel 1, note 65, duration 173, von 112, voff 44

MemIdx = 48 - MIDI Note at tick 5714, channel 1, note 62, duration 188, von 87, voff 54

MemIdx = 56 - MIDI Note at tick 6678, channel 1, note 62, duration 179, von 118, voff 58

 
...so that matches OK. 

Considering the rest of the XML:

LoopData writeIndex="64"
recordStartTime="0"
recordStopTime="6857"
lastEventTime="5902"

nBars="4"

Where does that come from, I wonder? E.g. that figure of 5902 for lastEventTime: the last note is at tick 6678.


5902 = 5714 + 188. That's actually the start time plus the duration of the penultimate note. Curious.

The latest routines just enabled me to convert a polyphonic MIDI file of Für Elise into Orba XML and it played first time, no dummy data required - pleased with that. As well as writing the code, ChatGPT has been making helpful observations, such as identifying how the essential "writeIndex" value is calculated in the XML - it's simply the number of values in the note blocks file.
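
That one's easy to check against the "Seq" test: eight notes at eight values per block gives 64.

import base64

seq_event_data = ("EAoASmUpqQAQgANKTTCnABDjA0UzJLoAEJIDRTwnxQAQ4gNBbSzXABCy"
                  "A0FwLK0AEL8DPlc2vAAQxAM+djqzAA==")
print(len(base64.b64decode(seq_event_data)))   # 64 -> writeIndex="64"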

https://youtu.be/G-4zJExTIN0
