
ChatGPT v Orba 1


Part 1

Around page 22 of the "Orba hacking knowledge base", a year or so ago, @Subskybox and I were dissecting the eventData string the Orba 1 uses to represent sequences. @Subsky did some clever mathematical analysis while I did the donkey work of setting up experiments and recording the results.


Some of the experiments were based on a song called "DPC" which played the first seven notes of a minor scale. I've attached the song file, console output, and a spreadsheet @Subsky put together after analysing the data.

The eventData string is a mix of note and performance data, but this "DPC" test simplifies things to only include note data. This is organised as a series of "note blocks":

Note Block 1:
PlayNote: 16
startTicksLSB: 7
startTicksMSB: 0
Note #: 62
Vel On: 120
Vel Off: 90
DurTicksLSB: -11
DurTicksMSB: 1

Note Block 2:
PlayNote: 16
startTicksLSB: 89
startTicksMSB: 7
Note #: 64
Vel On: 127
Vel Off: 92
DurTicksLSB: -17
DurTicksMSB: 1

Note Block 3:
PlayNote: 16
startTicksLSB: -105
startTicksMSB: 7
Note #: 65
Vel On: 113
Vel Off: 92
DurTicksLSB: -46
DurTicksMSB: 3

Note Block 4:
PlayNote: 16
startTicksLSB: -122
startTicksMSB: 7
Note #: 67
Vel On: 121
Vel Off: 80
DurTicksLSB: -31
DurTicksMSB: 3

Note Block 5:
PlayNote: 16
startTicksLSB: 108
startTicksMSB: 7
Note #: 69
Vel On: 118
Vel Off: 58
DurTicksLSB: -91
DurTicksMSB: 1

Note Block 6:
PlayNote: 16
startTicksLSB: -100
startTicksMSB: 7
Note #: 70
Vel On: 127
Vel Off: 91
DurTicksLSB: -20
DurTicksMSB: 1

Note Block 7:
PlayNote: 16
startTicksLSB: 113
startTicksMSB: 7
Note #: 72
Vel On: 87
Vel Off: 55
DurTicksLSB: 116
DurTicksMSB: 1

If you take this series of values, convert them to unsigned bytes, and encode them as a Base64 string, you get the corresponding eventData string from the .song file:

"EAcAPnha9QMQWQdAf1zvAxCXB0FxXNIFEIYHQ3lQ4QUQbAdFdjqlAxCcB0Z/W+wBEHEHSFc3dAE="

This appears in the .song XML as follows:

<LoopData writeIndex="56" recordStartTime="0" recordStopTime="11882" lastEventTime="4809"

nBars="7" eventData="EAcAPnha9QMQWQdAf1zvAxCXB0FxXNIFEIYHQ3lQ4QUQbAdFdjqlAxCcB0Z/W+wBEHEHSFc3dAE="

eventDataCrc="1ff6d4c4"/>

The problem we found is that the timing data is relative: when each note plays is affected by the note before it. That makes real-time quantisation a bit of a nightmare. It might be possible to implement "offline" quantisation, processing a .song file to quantise the data, or to create new sequences based on MIDI data, but it's a hassle and we pretty much abandoned the investigation at that point.
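For what it's worth, the relative reading can be checked against the XML above. A minimal sketch, assuming start ticks are LSB + 256 * MSB and that each note's start is relative to the previous note's start; under that reading, the last note's end lands exactly on the recordStopTime from the LoopData element:

```python
# Start-tick (LSB, MSB) pairs from the DPC note blocks, read as unsigned
# bytes (so e.g. -105 becomes 151). Assumes ticks = LSB + 256 * MSB and
# that each start is relative to the previous note's start.
starts = [(7, 0), (89, 7), (151, 7), (134, 7), (108, 7), (156, 7), (113, 7)]
last_duration = 116 + 256 * 1  # final note's DurTicks (LSB 116, MSB 1)

absolute = []
t = 0
for lsb, msb in starts:
    t += lsb + 256 * msb
    absolute.append(t)

print(absolute)                      # [7, 1888, 3831, 5757, 7657, 9605, 11510]
print(absolute[-1] + last_duration)  # 11882, the recordStopTime in the XML
```

The fact that the final note's end matches recordStopTime="11882" supports the relative-start interpretation.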
 
A few months later, ChatGPT arrived on the scene...

Attachments: song (31.2 KB), txt (1.28 KB), xlsx


Just unboxed a new Orba 2. While they have their problems, I'm still pleased with it. :-)


I copied the loopData from Scotland The Brave into an .artisong and it was recognisable, so that's a promising start.

Here's the latest version of this utility. I was able to download a MIDI file of Gershwin's 3rd Prelude from here:

https://alevy.com/gershwin.htm

...then run "py convert.py prelude3.mid".

This generates a loopData XML entry which can be swapped into a song (Ohm in this case) and plays the track. ("ger3", tempo 50.) 

Attachment: zip (21.1 KB)

Thanks for that. Yes, that's a better way to represent the data.

I found I was hitting the Orba 1 note limit with some of the MIDI files I was converting. Someone on the FB group asked if the Orba 2 would provide more capacity for this, and I was curious to see if it would, and whether sequence data is represented in the same way, which is one of the reasons I decided to pick one up. Another was to see if ChatGPT might be able to progress the efforts to create a decent system for mapping samples.

I also wanted to see if the synth engine is identical. I dunno, not sure if the synth engine is even based on the same processor, but I presume so. And no-one ever made an editor for drum sounds, so I was curious to look into that as well.

This generated code is nice and clean and avoids conditional negative numbers. It's best to read these values as unsigned ints, since they should all be positive. I had negative numbers from way back because they were easier to understand when transposing a note up (+) or down (-); for songs we'll never need that. This code is also a good starting point for picking other data structures out of Base64 strings, like CC values and Pitch Bend data. Most values are in the range 0-127, but Pitch Bend has a bigger range, which is why it needs two bytes.
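A quick illustration of the two readings, using Python's struct module on the DurTicksLSB byte from the first DPC note block:

```python
import struct

raw = bytes([245])  # DurTicksLSB byte from the first DPC note block

# Unsigned reading (format 'B'), as the generated code uses:
print(struct.unpack('B', raw)[0])  # 245

# Signed reading (format 'b'), as the original spreadsheet used:
print(struct.unpack('b', raw)[0])  # -11

# The two readings are the same byte, related by masking with 0xFF:
assert -11 & 0xFF == 245
assert 245 - 256 == -11
```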



The Output is as expected:

 

[[16, 7, 0, 62, 120, 90, 245, 1], [16, 89, 7, 64, 127, 92, 239, 1], [16, 151, 7, 65, 113, 92, 210, 3], [16, 134, 7, 67, 121, 80, 225, 3], [16, 108, 7, 69, 118, 58, 165, 1], [16, 156, 7, 70, 127, 91, 236, 1], [16, 113, 7, 72, 87, 55, 116, 1]]

 

I decided to play with ChatGPT using examples you've provided and coached it to provide this:


 

import base64
import struct

base64_string = 'EAcAPnha9QMQWQdAf1zvAxCXB0FxXNIFEIYHQ3lQ4QUQbAdFdjqlAxCcB0Z/W+wBEHEHSFc3dAE='

# Decode the Base64 string
decoded_bytes = base64.b64decode(base64_string)

# Convert the decoded bytes to an array of unsigned integers
unsigned_int_array = struct.unpack('B' * len(decoded_bytes), decoded_bytes)

# Group into 8-byte note blocks (each PlayNote record is 8 bytes long)
grouped_array = []
temp_group = []
for num in unsigned_int_array:
    temp_group.append(num)
    if len(temp_group) == 8:
        grouped_array.append(temp_group)
        temp_group = []

print(grouped_array)

 

Cheers; it's mainly a programming tutorial for me, trying to pick up a bit of Python. Look forward to checking out the stuff you wrote for Orba 2.

These devices have so much potential, but I've lost interest in making software for them. I hope they can make these things stable by the end of the year. It's nice to see your progress :)



Re: "# 32 / 8 bytes?? Similar to #16"

I seem to remember finding that multiples of 16 (0, 32) act in the same way as 16. I'm not sure what the difference is. I found that 16s started turning into 32s in a sequence where I was playing rapid notes, but I'll need to investigate further.

Just ordered another Orba 2, must be crazy but I want to see how sequence data works on that. ;-) 

I managed to find the notes I had taken on what I'd decoded from song eventData:


eventData:


# PlayNote

Command (16 is PlayNote, but there are likely others for CC values etc.)

Relative Start Tick [LSB] since last (event or note unknown)

Relative Start Tick [MSB]

MIDI Note #

Velocity ON (1-127 follows 7 bit MIDI standard)

Velocity OFF (1-127 follows 7 bit MIDI standard)

Duration in Ticks [LSB]

Duration in Ticks [MSB]


# CC messages require 4 bytes: 

20|23|39?|36?

startTickLSB

startTickMSB

Value


# PB messages appear to be 6 bytes: 

21

startTickLSB

startTickMSB

unused

valueLSB

valueMSB


# ??

37

byte1

byte2



# 32

8 bytes?? Similar to #16
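Taken together, those tentative record sizes suggest a simple dispatch-on-command-byte walk over the decoded stream. A sketch only: the sizes for commands 20, 21, and 37 are unconfirmed guesses from the notes above (37 is assumed to be 3 bytes, command plus byte1/byte2), and `split_events` is a hypothetical helper, not anything from the actual firmware:

```python
import base64

# Tentative event sizes from the notes above; 20/21/37 are unverified.
EVENT_SIZES = {16: 8, 32: 8, 20: 4, 21: 6, 37: 3}

def split_events(event_data_b64):
    """Walk the decoded byte stream, slicing one event per command byte."""
    raw = base64.b64decode(event_data_b64)
    events, i = [], 0
    while i < len(raw):
        cmd = raw[i]
        size = EVENT_SIZES.get(cmd)
        if size is None:
            raise ValueError(f"unknown command byte {cmd} at offset {i}")
        events.append(list(raw[i:i + size]))
        i += size
    return events

dpc = 'EAcAPnha9QMQWQdAf1zvAxCXB0FxXNIFEIYHQ3lQ4QUQbAdFdjqlAxCcB0Z/W+wBEHEHSFc3dAE='
for ev in split_events(dpc):
    print(ev)
```

On the DPC string this yields seven 8-byte PlayNote events; a stream containing CC or pitch-bend records would exercise the other sizes, once they're confirmed.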



Just one more to prove it wasn't a fluke...;-)


Scotland The Brave. Tempo should be set to about 70.

Because I haven't nailed nBars yet it might go quiet for a while after you first upload and play it.


Attachments: song (36.4 KB), mid (4.04 KB)


(Song file for the start of Maple Leaf Rag attached...needs a slow tempo and fewer bars.)


Attachment: song

OK...I think this might get things back on track.

Using "midi2.py" instead of "midi1.py" will hopefully handle conventional MIDI files with note-on/note-off sequences. I just downloaded a MIDI of the Maple Leaf Rag and followed these steps with the new version:

py midi2.py <midifile.mid>
py durations.py
py noteons.py
py dur2orb.py
rename orba_notes -> note_blocks
py rawtostring.py

...etc., and it plays OK. (I only used the first section of the "note_blocks" file as it was pretty long and I'm still not sure what the note limit is.)

I'm interested in taking a look at the MIDI data for the other parts next, see what I can do with drums etc.

Attachments: mid (20.6 KB), py (1.41 KB)

(The main stumbling block here for anyone else who's remotely interested will be that stupid MIDI convention of replacing note-off with note-on vel 0. Unfortunately I didn't realise Sonar was doing that until I was fairly well into proceedings. midi1.py can't handle the conventional format, so I'll need to revisit that at some point.)
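That vel-0 convention is easy to normalise before any other processing. A minimal sketch using plain (type, note, velocity, time) tuples rather than a real MIDI library; the event shape here is hypothetical:

```python
# Normalise "note-on with velocity 0" (which many MIDI files use in
# place of note-off) into explicit note-off events.
def normalise(events):
    out = []
    for etype, note, vel, time in events:
        if etype == 'note_on' and vel == 0:
            out.append(('note_off', note, vel, time))
        else:
            out.append((etype, note, vel, time))
    return out

print(normalise([('note_on', 60, 100, 0), ('note_on', 60, 0, 480)]))
```

After this pass, downstream code only ever has to pair note-ons with genuine note-offs.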

tldr:

py midi1.py <midifile.mid>
py durations.py
py noteons.py
py dur2orb.py
rename orba_notes -> note_blocks
py rawtostring.py

Copy a song file and replace eventData along with lastEventTime, recordStopTime, writeIndex and nBars, calculated as above. (After creating the note_blocks.txt file, I run decodeblocks.py to report the "console output" summary with all the absolute times, notes, durations, etc., as a reality check to see if it looks sensible, and to get the numbers for calculating those additional XML values.)
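For the "copy a song file and replace the attributes" step, a rough sketch; `patch_loopdata` is a hypothetical helper that assumes each attribute name appears exactly once in the file, and note that eventDataCrc would presumably also need recalculating, which this doesn't attempt:

```python
import re

# Patch LoopData attributes in a .song file's XML text. Attribute names
# are taken from the LoopData element shown earlier in the thread.
def patch_loopdata(song_xml, **attrs):
    for name, value in attrs.items():
        song_xml = re.sub(r'%s="[^"]*"' % name,
                          '%s="%s"' % (name, value), song_xml)
    return song_xml

snippet = '<LoopData writeIndex="0" nBars="0" eventData="" eventDataCrc=""/>'
print(patch_loopdata(snippet, writeIndex=56, nBars=7, eventData='EAcA...'))
```

A real version would read the .song file, patch all five values reported by decodeblocks.py, and write the result back out.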
  

Attachments: py, txt (4.83 KB)