Back

2024 05 31

Ouch, this guy actually has experience with some bad internet speeds. Basically the same as what I deal with when it gets really windy and the power lines are swaying. Seems like even a slow-speed ham data connection would help.

๐Ÿธ Ribbit

Continuing some more CSS tonight, as well as getting into connecting up some basic functionality. Interestingly, an old iPhone reports innerWidth as 1960, so fonts end up tiny. The newer, much larger iPhone reports 980, and my desktop reports 960. There's no obvious way for a font to know it's on a small device with an awkwardly high reported resolution. To get something readable, where the UI font matches the size rendered in the chat window, the iPhone 7 requires the font to be something like 48pt, whereas on the iPhone Xs Max the font is set to 16pt. Wacky. I wonder if there's a way to get the physical inches or cm reported from the device.
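One likely culprit, assuming the page is missing a `<meta name="viewport" content="width=device-width, initial-scale=1">` tag: without it, mobile Safari lays the page out in a legacy-width layout viewport instead of tracking the physical screen, which would explain the odd innerWidth values. Until that's sorted, a crude workaround is to scale the font by the reported width. This is a hypothetical sketch, not the app's actual code:

```javascript
// Hypothetical workaround: scale the UI font by the reported viewport
// width, assuming 16pt reads well when innerWidth is 980.
const BASELINE_WIDTH = 980; // width at which 16pt looks right
const BASELINE_PT = 16;

function uiFontPt(innerWidth) {
  // Text then covers roughly the same fraction of every screen.
  return Math.round(BASELINE_PT * (innerWidth / BASELINE_WIDTH));
}

console.log(uiFontPt(980));  // 16
console.log(uiFontPt(1960)); // 32
```

Linear scaling alone wouldn't explain needing 48pt rather than 32pt on the iPhone 7, so the viewport meta tag is probably the real fix.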

sending and receiving messages

Got the new UI sending and receiving messages. I need to add the options page to enter a callsign and a location, but I'll do that tomorrow. The update can be found here. There are some bugs on Windows, but it's working on macOS, Android, and iOS/Safari. On the desktop the fonts get kind of big on macOS, but that might be my browser settings. I'll have to check that out tomorrow.

The options page will hold the callsign and phone number for unique identification, a button to update GPS coordinates and grid square, and, depending on the platform, a way to switch microphone and speaker devices. I'll also add a +/- for font size settings as well as a light and dark mode. So many options.

so many options

2024 05 29-30

Busy busy busy.

๐Ÿธ Ribbit

Managed to greatly reduce the CSS to build the UI.

simple CSS

Managed to get the chat panel's message boxes down to 32 lines of CSS. Each message requires a sender div with a name and time; the contents are just a series of paragraph elements. With the simplification, the UI reacts well to different screen sizes and the chat window can be resized to fit the screen.
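For reference, the message structure described above might be built like this. The class names are illustrative guesses, not the actual Ribbit markup:

```javascript
// Illustrative sketch of the message structure: a sender div holding the
// name and time, followed by one paragraph element per line of content.
function renderMessage(name, time, lines) {
  const paragraphs = lines.map(text => `<p>${text}</p>`).join("");
  return `<div class="message">` +
    `<div class="sender"><span class="name">${name}</span>` +
    `<span class="time">${time}</span></div>` +
    paragraphs +
    `</div>`;
}

const html = renderMessage("N0CALL", "21:04", ["Hello", "World"]);
```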

2024 05 24-28

Happy Memorial Day 🏖 weekend! Lots of cars washed and ready for the summer, and some CSS. Also, it's about time I was productive on side projects again. The backyard is mostly in order; still some broken fences to repair. But the garden has just about everything planted. I'll be back at it tonight.

๐Ÿธ Ribbit

Working out some CSS for a basic UI.

Working from a template worked out over chat. Making sure everything responds to different screen sizes and remains readable is always a lot of work. Since this is a chat app where everything needs to be readable, it's a bit more difficult. I also need to test across all the different devices I have here to make sure the fonts are readable and appear to be the same size on different screens. So much packaging.

packages

2024 05 19-23

So busy with work! Haven't had a lot of time to put toward any personal projects. I'll be back at it this weekend with the extra day off from work.

2024 05 16-18

Figuring out media devices on safari. Also, why is the "body" tag on this page acting strange?

๐Ÿธ Ribbit

Why does Apple need to make everything fussier? Getting a list of attached media devices just works on everything other than an Apple product.

The first thing the app needs to do is ask for permission to access the media devices, which is done with the getUserMedia function. Once the app has permission, it can enumerate the devices with the enumerateDevices function, which returns a list of every device the browser can access. From there the app filters the list down to audio devices, splits those into a list of input devices and a list of output devices, and builds a select element for each list to add to the page.
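The flow above, sketched out. The select-element ids are hypothetical, and only the pure grouping helper is exercised outside a browser:

```javascript
// Split an enumerateDevices() result into audio inputs and outputs.
function groupAudioDevices(devices) {
  return {
    inputs: devices.filter(d => d.kind === "audioinput"),
    outputs: devices.filter(d => d.kind === "audiooutput"),
  };
}

// Browser-side flow (ids like "mic-select" are made up for the sketch).
async function buildDeviceSelects() {
  // Ask permission first; otherwise Safari returns devices with no labels.
  await navigator.mediaDevices.getUserMedia({ audio: true });
  const devices = await navigator.mediaDevices.enumerateDevices();
  const { inputs, outputs } = groupAudioDevices(devices);
  for (const [id, list] of [["mic-select", inputs], ["spk-select", outputs]]) {
    const select = document.getElementById(id);
    for (const d of list) {
      const option = document.createElement("option");
      option.value = d.deviceId;
      option.textContent = d.label || d.kind;
      select.appendChild(option);
    }
  }
}
```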

2024 05 13-15

Some extra time required for work, needed to fix some stuff up for presentation soon.

2024 05 11-12

All day at IKEA...

๐Ÿธ Ribbit

Will continue UI work tonight.

2024 05 09-10

Design update.

๐Ÿธ Ribbit

Going to do some work to update the UI this weekend.

2024 05 07-08

reading minds isn't possible.

๐Ÿธ Ribbit

So there's a UI that I need to build to, but there's no reference as to what it looks like. So I need to wait until there's something to refer to before continuing.

2024 05 05-06

Found some great Chicken 65 and pondering how to attract new ham radio operators.

๐Ÿธ Ribbit

Set up the wasm module in the new simplified UI and building a nice UI to select audio input and output devices. By default the browser will select the first input and output device available. On a PC it's likely that there will be more than one I/O device in the enumerated list.
For that reason, a simple Enable Audio button should show a list of available devices to select from if there's more than one option available.
Note to self: don't put wasm code into a script with type="module", as the scope from the wasm module's import script will be hidden from code living inside a module-type script node. The downside of a non-module script is that strict JS is not enforced, so minor formatting issues might be silently ignored by some browsers.

Looking at a nice setup to display a list of devices for the browser to access. Here's a simple example of how to get a list of audio devices available to the browser.
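The example mentioned here didn't make it onto the page, so here's a minimal sketch of what it could look like; the filter helper is the only part that runs outside a browser:

```javascript
// True for microphone and speaker entries from enumerateDevices().
const isAudioDevice = d => d.kind === "audioinput" || d.kind === "audiooutput";

async function logAudioDevices() {
  await navigator.mediaDevices.getUserMedia({ audio: true }); // permission
  const devices = await navigator.mediaDevices.enumerateDevices();
  for (const d of devices.filter(isAudioDevice)) {
    console.log(d.kind, d.label || "(no label)");
  }
}
```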

2024 05 04

After a break, it's time to get back into some fun.

๐Ÿธ Ribbit

Working on more UI prototypes.

Some basic functionality for the UI is being created. Theme-wise, I'll leave in a bunch of options to enable a lot of customization: light/dark themes, custom color palettes for expert users, etc. This is all simple CSS, so it should be easy to pick up on how to make changes.

2024 04 26-28

Additional work on tertiary projects.

2024 04 25

Happy Friday. And some tertiary projects need attention tonight.

The Other things...

Doing some art tonight for a change of pace.

2024 04 24

Happy birthday, Guglielmo Marconi! Seems like nice weather keeps me from getting anything done on my side projects. But tonight it's cold and gloomy out, so I guess that means I'll be able to focus on getting something done.

๐Ÿธ Ribbit

Fiddling with some wasm loading simplifications.

2024 04 23

Digging through code and getting geolocation information. And I need to get some icons updated for an unrelated project.

๐Ÿธ Ribbit

So I'm going through some geolocation code from the ribbit PWA test by k0ozk, hoping to get a grid square from the GPS position. From the coordinates, I found some code that would translate that into a ham grid square, like CM97 for example. Later tonight I'll also build a template for loading ribbit.wasm into a web page without the emscripten template, to make the package more portable.

2024 04 19-22

Getting some work done on some other projects, something fun to read for programmers, and War of the Worlds in CW.
A nice directory listing extension for servers. And some amazing volumetric capture.
I've noticed that as the weather has gotten really nice I'm finding less time to work on side projects.

2024 04 16-18

Busy with work, and house stuff. I'll be back at it tonight.

2024 04 15

Taxes, and CW.

2024 04 11-14

Birthday stuff and gardening.

2024 04 10

Ham2K Portable Logger has an interesting interface worth looking at. In particular, the input area above the keyboard on mobile has some very nice features. I should look at how I could do some of the same things on mobile. Also, a good CW font is something I'd like to build.

CW-Font

The pipeline for generating a font might be: build something in an actual editor like Affinity Designer, export an SVG, clean up the SVG manually, then import each letter into FontForge. I've tried the built-in tools in FontForge, but they lack the precision I'd like to have. I think I'll need to build a font from scratch to get the results I want.

2024 04 08-09

Not completely sure how I managed to get through all of the steps to post a build to the Google Play account, but I managed to upload an application bundle to the Google Play Store to maintain the developer status for ORI.

๐Ÿธ Ribbit

By posting the updated application bundle, the developer status should remain intact; I believe posting a new version of the app fulfills Google's requirements. Getting this to work involved certificate signing, uploading zips, and all sorts of creating and generating passwords. Quite a bit of guesswork, as a lot of the documentation seems to be out of date and a lot of the Android Studio UI has deviated from it. Sadly, I think Apple's Xcode process of building and posting an app is more coordinated.

2024 04 07

Helping at the art gallery in downtown. A new season of anime has started, lots of fun new shows, and lots to mess with on the UI.

๐Ÿธ Ribbit

As a side project, I thought it would be fun to write a simple callsign-to-Identicon generator. Also doing some work to make the UI responsive for mobile, which includes collapsing some views so they're not cramped on a small screen.

2024 04 06

Can Ribbit be adapted to operating a BBS? This video might lead to some answers.

๐Ÿธ Ribbit

Continuing work on the CSS for the UI tonight. Lots of optimizations and refinements.

revision2

Adding in places to display various bits of user data. Adding in some color variations to help different areas to pop out. It's all very minimal for now, but as I get more of the features in place I'll be able to add more color and more features. I'll also need to consider how parts of the layout collapse and make room for small mobile screens and landscape views.

2024 04 05

So, Bing Copilot is pretty handy. I asked how to convert a GPS coordinate to a grid square, and this is what it replied:

Certainly! To convert GPS coordinates to a **Maidenhead Grid Square**, you can follow these steps:

1. **Longitude Calculation**:
- Add **180** to the longitude value.
- Divide the result by **20** and add **1**.
- Determine the corresponding letter of the alphabet (usually in uppercase).
- For example, if your longitude is **32.123 W**, the calculation would be:
- \(147.877 / 20 = 7\). Adding one gives the 8th letter of the alphabet, which is **H**.
- The remainder is **7.877**. Divide this by **2** (no conversion required), resulting in **3**.
- The remaining value is **1.877**. Multiply it by **12** and add **1**, rounding down to the nearest integer. This gives the letter **w**.

2. **Latitude Calculation**:
- Add **90** to the latitude value.
- Divide the result by **10** and add **1**.
- Determine the corresponding letter of the alphabet (usually in uppercase).
- For example, if your latitude is **14.321 N**, the calculation would be:
- \(104.321 / 10 = 10\). Adding one gives the 11th letter of the alphabet, which is **K**.
- The remainder is **4.321**. Divide this by **24**, rounding down to the nearest integer. This gives the letter **h**.

3. **Combine the Pairs**:
- Alternate between longitude and latitude pairs, starting with longitude.
- Using the example values, the grid square for **32.123 W, 14.321 N** would be **HK34wh**.
If you prefer not to do the math manually, there are online tools and apps available that can convert GPS coordinates to Maidenhead Grid Squares. For instance, you can use the **Amateur Radio Grid Square Tool** available on Google Play². Happy ham radio operating! 📻🌍

Source: Conversation with Bing, 4/5/2024
(1) Amateur Radio Grid Square Tool - Apps on Google Play. https://play.google.com/store/apps/details?id=appinventor.ai_izzybella419.AmateurHamRadioGridSquareCalculator.
(2) How can one convert from Lat/Long to Grid Square?. https://ham.stackexchange.com/questions/221/how-can-one-convert-from-lat-long-to-grid-square.
(3) Gridsquares - Amateur Radio Wiki - HamTools. http://wiki.hamtools.org/index.php?title=Gridsquares.
(4) Grid Locator - ARRL. https://www.arrl.org/grid-locator.

This is pretty cool. I hope it works 😀, I'll check on it tonight.
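As a sanity check on Copilot's recipe above (its worked examples look arithmetically shaky), here's a sketch of the standard Maidenhead math: two letters for the 20°x10° field, two digits for the 2°x1° square, two letters for the subsquare. Worth verifying both against a known locator before trusting either:

```javascript
// Standard Maidenhead grid square conversion (6-character locator).
function toGridSquare(lat, lon) {
  const A = "A".charCodeAt(0), a = "a".charCodeAt(0);
  const adjLon = lon + 180, adjLat = lat + 90;
  // Field: 20 degrees of longitude, 10 degrees of latitude per letter.
  const field = String.fromCharCode(A + Math.floor(adjLon / 20)) +
                String.fromCharCode(A + Math.floor(adjLat / 10));
  // Square: 2 degrees of longitude, 1 degree of latitude per digit.
  const square = `${Math.floor((adjLon % 20) / 2)}${Math.floor(adjLat % 10)}`;
  // Subsquare: each square split into 24 slices per axis.
  const sub = String.fromCharCode(a + Math.floor(((adjLon % 2) / 2) * 24)) +
              String.fromCharCode(a + Math.floor((adjLat % 1) * 24));
  return field + square + sub;
}

// San Jose area, roughly 37.33 N, 121.89 W:
console.log(toGridSquare(37.33, -121.89)); // "CM97bh"
```

This reproduces the CM97 example mentioned in the 04 23 entry.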

2024 04 03-04

More rain this weekend, but my running is getting back on track.

๐Ÿธ Ribbit

Still doing some preliminary work on the CSS for a general chat window, modeling it a bit after apps like Discord, Slack, Google Chat, and some others. Looking at apps like HAMRS for logging features. A lot of these features can be integrated into a web app with a chat window and made friendly for mobile. I'll start posting images once I start to figure out where everything should go.

screenshot20240405

Starting to do some layouts, no color schemes applied yet just to get a layout composition.

2024 04 02

The weather today is very nice. Had to spend a bit of time outside in the sun.

๐Ÿธ Ribbit

Going to work a bit on the UI, try to match some of the things I see in other chat apps. Timestamps, names, messages etc.

2024 04 01

Happy April fools, I'm not going to do anything silly today. I'm going to get some work done.

๐Ÿธ Ribbit

Going to put together some CSS layouts for the chat window in a little test dummy case.

2024 03 31

Happy Easter!

๐Ÿธ Ribbit

Built up a template for a Progressive Web App. It's just an empty shell at the moment, starting to load in all the minimal parts to make a web page that tells the browser there's a thing to install.

He Is Risen

2024 03 28-29

Looking forward to a productive weekend.

๐Ÿธ Ribbit

Going to do a bit of work to add a message header with callsign, GPS, and tx timestamp. Then I'll use the browser's local database to store messages; leaving them as text in a div loses history if you re-open the app. With the DB you also enable searching/sorting and some persistence, which allows caching of messages so the history window doesn't have to store everything you've sent and received. So I'll limit the text window to something like 100 messages, and past that a local read from the DB will populate the window with older messages. Otherwise the window would end up rather huge, which would hurt performance.
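The capped history window described above might be sketched like this; the names are hypothetical and the database side is assumed to be handled elsewhere:

```javascript
// Hypothetical sketch of the capped chat window: keep at most MAX_VISIBLE
// messages in the live list; older ones live only in the database.
const MAX_VISIBLE = 100;

function appendMessage(windowMessages, message, max = MAX_VISIBLE) {
  windowMessages.push(message);
  // Drop the oldest entries once we exceed the cap; the full history
  // is assumed to already be persisted in IndexedDB.
  if (windowMessages.length > max) {
    windowMessages.splice(0, windowMessages.length - max);
  }
  return windowMessages;
}
```

Scrolling back past the cap would then trigger a DB read to refill the window with older messages.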

indexedDB

So after some fiddling, I figured out that some variables will only be initialized in the onupgradeneeded callback from opening a database. This is of course only called if you've either deleted an old database or incremented the version number used for the database.

2024 03 26-27

Working on getting back into the flow.

🪄 Code Witches

So there are a lot of repeated patterns in the code for listening for keys and reacting to them in different setups: battles, dialog, navigation, etc. Building a generic "Menu" system is something I've intended to do since I figured out the pattern that all of the different systems seem to follow. There's a view of some kind that waits for different input events, and the input from the Menu should trigger those events. I'm not sure how I want to trigger the events. Of course there's the usual `CustomEvent`, but then I'd have a lot of listeners, and there's a chance that a listener may trigger for an event that's no longer needed. 🤔 Still a lot to work out to make sure this will work as intended for the different situations in the game.

2024 03 23-25

A day of picking weeds, going to the gym, bottled water, and house chores.

🪄 Code Witches

Doing some work to make some of the dialog system a bit easier to extend. Will consider setting up a json file that is the different "scenes" with characters/dialog and choices etc. This will make it easier to add in new content and make the game more dynamic.

2024 03 19-21

Updates and responsibilities; last night was a trip up to the city for a dinner with the client. Haven't had a whole lot of time to get anything done. I'll be back at it tonight.

๐Ÿธ Ribbit

Yay, so I got the new buffering setup put into a clean build and it's decoding very reliably; very exciting. Now I should do a bit of work to make a basic web page available. First this will be a super simple single page that acts the same as the hello world Ribbit app, so I can have some people test that the system works on their devices. I can also do a bit of testing on lambda or something as well.

yay!

So the decoding works well, no weird glitching. Next I should look at putting together the rest of a usable UI, but I should also look at streamlining the js file that interfaces with the module, since there's not a whole lot of the debug code necessary now that it's working. Check it out here. From my testing I've run it on iOS/Safari on an iPhone XL, on an Android Pixel 3, on a cheap T-Mobile phone, and on Windows desktop in Edge and Brave.

2024 03 18

Happy Monday, bought seeds for the garden 🌱. I should do a few more hours of work on one of the other projects before getting back to Ribbit tonight. Also, last night the XYL and I were talking about a "Ham Cafe": a coffee shop/sandwich shop with an antenna farm, where each table would have several coaxial cables so you could play with your radio while sipping a latte and eating a ham sandwich.

๐Ÿธ Ribbit

I also should update the APK on Google's Play Store tonight. Downloading and updating some Gradle-related things; I feel like every time I open Android Studio, I have to download another dozen GBs of updates just to build the app.

Looks like the updates Google has decided to make devs go through involve filling in many questionnaires and forms; uploading images, screenshots, and icons; and agreeing to a bunch of terms and conditions. Also some encryption keys for signed app bundles. I'm not sure if I'm going to be able to get this done tonight.

Getting stuck trying to update the cert for uploading a signed version of the app. Not sure if I can get the passwords/private key for the previous version, so I'm trying to make a new cert. In the process, my version of Java doesn't have the encryption type expected by Google 😔.

Somewhere along the way I've gotten lost in all of this cert signing. I'm not much of an app dev, since I spend most of my time on the web end of things, and I'm not sure I'll get much further without some additional information. I'll have to ask around in the Slack tomorrow.

2024 03 16-17

Other side projects took precedence last night. Had to build some more visuals, but I think tonight I'll make it a point to get the updated buffering scheme into Ribbit. Or maybe not tonight; went out to do a bit of St. Paddy's Day stuff. I'll be back at it tomorrow.

2024 03 15

Another busy weekend ahead. Some additional work on side projects. And I have more ideas for the overflow buffer.

๐Ÿธ Ribbit

So JavaScript arrays are too slow for the framerate of the audio processor, which I think means the overflow needs to live in the wasm module. So tonight I'll make a buffer that gets written to, with a separate value to indicate its size. Since wasm module memory is a fixed size, I need to tell the module how many values were written to the overflow, so when the next "read" begins it knows how much to read from the overflow before reading from the incoming audio values. This entire 160/256 buffer thing is rather tricky.
This would be avoided if either:
a) JavaScript would actually clear buffers as quickly as it can be asked to, or
b) wasm memory were more dynamic and had more utility functions available for arrays, like split/push/pop, etc.
In any case, there are still some bug fixes required before moving on to building the rest of the Ribbit app.
On a slightly related note: I've been asked to help build a Discord bot for a different iOS app, which leads me to want to make a Ribbit Discord bot as well. So many things! The last day I was able to spend any real time on Ribbit was the 8th (last week); hoping once things settle down a bit I'll be able to focus on it some more soon.
Also, a frustrating side effect of debugging is that it slows down the virtual machine in a way that gives the buffer enough time to properly clear.

Modifications: changing the feed length to 2048 to match the input buffer size. When we fill the buffer with multiples of 160, we tell the decoder to read the memory, and it takes care of the overflow and clears the memory. I think I might do an isolated test in an isolated module to see how to manage that, similar to how I tested this in JavaScript last week.

Of course it's late, but I did get the cpp version of the JavaScript array swapping to work.

The overflow gets fed values that didn't make it to the decoder. Then the next round the decoder starts off with what was left over.

So according to CoPilot, this is a common sort of function.

Of course it's new to me, since I've never needed to do this sort of thing in cpp before. But if this is a good example, then I'll leave it as is.

Here's the result feeding into the audio context at 8000 Hz, with a feed length of 2048 and chunks of 160 floats. The overflow buffer lives in the wasm module's memory space, and the feed isn't updated very often, so there's plenty of time for the processor to work on the data. Also, there aren't any JavaScript arrays being used, so the virtual machine never gets bogged down by buffer clearing. I think this is a good solution. I'll have to integrate it into the main app and see how it works with the rest of the code, but that comes tomorrow.

2024 03 14

Continuing some work on some additional side projects for a couple more nights. Then I'll be back on some of the more interesting stuff. Also Happy Pi day! 🥧

๐Ÿธ Ribbit

I did a little bit on Ribbit, trying to figure out the buffer issue. It looks like the overflow buffer between updates isn't getting cleared properly. I think this is a limitation of the speed of JavaScript in the browser not being able to keep up with the audio processing frame rate. Experimentation this weekend to come.

2024 03 13

Some art work is required for an unrelated project, so I need to spend a few hours on that. Then I'll be back on debugging the decoder.

2024 03 12

Reading into how the decoder works, and how the preamble works. Also some other projects for some friends.

2024 03 11

Meetings, meetings, and more meetings. The expectation for getting things done should be reduced based on the number of meetings one has to attend. If, in an 8-hour day, you have 4 hours of meetings, don't expect productivity to be anywhere near 50%.

๐Ÿธ Ribbit

I should add some form of visual debug for the values going into and out of the decoder. I think this is why the spectrum display is available, but there really should be some form of spectrum that operates at 8 kHz, not 48 kHz.

2024 03 10

Additional projects, yardwork, car batteries, fence repair, etc. I think tonight I just need to zone out and not do a bunch. Been at this nonstop for a while; time for a bit of nothing.

2024 03 09

Household errands, yardwork, a 🌯 and ☕.

๐Ÿธ Ribbit

Working a bit on making the UI useful for debugging. Also, I need to make the UI work on little handheld iPhone devices. Right now the text is tiny, which means the buttons are tiny. I didn't limit scrolling/scaling, so you can zoom in and press buttons, but that's far from optimal.

Things are more or less working, though the decode bug seems to be holding things up, so I'll be working on that. I think it's got more to do with the fetch than the decoding; if the decoding itself weren't working properly, I doubt I'd be getting anything sensible out at all. So I believe something is going on with the message buffer being incorrectly filled.
A great thing, though, is that the wasm is working nicely on a really old device. That's a good sign.

2024 03 08

Happy Friday! Time to order food, relax a bit, remember Toriyama, and get some work done.

๐Ÿธ Ribbit

Looking forward to integrating the encoder and decoder into the same page. Both the encoder and decoder work in isolation; to have them work at the same time, I suspect I'll need to pause audio decoding while the encoded message is transmitted, otherwise I'm sure the decoder will hear the signal and decode it. Some "chat" window should be made, and the decoded message should look for blank characters, aka tofu 🔖, and remove them.

A naive implementation by just copy/pasting code. On its surface it's working, but every time I press the encode button I create a new audio context. The interesting thing is that this reproduces the weirdness where fetchDecoder returns the garbage data I had seen last night, so there's a bit of a clue as to the cause.
So I'll need to maintain a single context for tx.

Investigating a more minimal emscripten-core HTML file without the canvas or output field, something I can customize with more freedom. I might also move the code into a local git repo so I can make mistakes and backtrack.
There's also something joyously simple about webdev + JavaScript. The magic of the browser means I just copy some files over and it's live. I don't have to worry about the build process or the deployment process; the files simply work.
For this reason, I avoid things like Node, Vue, Electron, etc. These environments require environments: a whole setup around them that has to be installed before anything works 🚮. I mean seriously, devs have worked for years to get away from that, only to have people who want to be clever make it all happen again. Oh, this time it's different, we're doing package management the right way. No, no you're not. There's no right way to manage libs, or to have environments, virtual machines, or any of that.
It's ALL awful.
rant

So I just copied the relevant files into a new directory and checked it in. Yay, 0 dependencies: no node, no vue, just HTML and JavaScript; no makefiles, no build.bat, no node run configs, etc.

Stripped down to the bare bones, now to clean up the javascript to a minimal state.

A very basic setup with really simple css to get started. Trying to figure out what sort of features will be needed. The UI will reflect on the code and thus the code will be written to reflect the UI. It's a simple chat window.

Still getting some garbage decoded. Interestingly though, this was able to decode more than one signal at once, sort of.

2024 03 07

Implemented the things, and don't forget to unmute your mic 🎤.

๐Ÿธ Ribbit

The array swapping thing required some unusual creation of empty arrays. For some reason, overflow.length = 0 simply wouldn't erase the previous contents of the array. I have a feeling something was holding onto the memory, disallowing it from being cleared before more elements were added 🪲. For testing, I left a looping audio recording of the hello world wav, and the decoder didn't have any overflow or underflow issues.
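The swap itself can be as simple as handing the old array off to the reader and starting each cycle with a brand-new one, rather than trusting length = 0. A sketch of the pattern:

```javascript
// Sketch of the swap pattern: the reader keeps the old array while the
// writer gets a fresh, definitely-empty one for the next cycle.
let overflow = [];

function drainAndSwap() {
  const leftovers = overflow; // reader takes ownership of the old array
  overflow = [];              // writer continues with a new empty array
  return leftovers;
}

overflow.push(0.1, 0.2);
const drained = drainAndSwap();
// drained holds [0.1, 0.2]; overflow is a new empty array
```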

The one issue is that randomly the decoded text comes back with garbage. It's difficult to tell where this one is originating, so I'll need some way to step through the cpp before being able to figure this one out. But as a general accomplishment the audio feed into the decoder is stable and the buffer is being managed properly.

After messing with the sample size, setting it to 2048, the decoded message got a lot more stable. If the response is -1, then the decoded message is garbage; I had to interfere with the mic to get a bad decode. The fetch decoder function returns a positive number when message decoding is successful and a negative value when it's not. I'm guessing the numbers above 0 refer to the number of flipped bits during decoding.

The above shows the app running on a phone from a web browser. I guess next I should integrate the encoder with the decoder in the same page. I should also start writing an actual html template rather than use the emscripten test template.

Also, Akira Toriyama passed away from something that sounds like a blood clot 😢. I grew up on Dr. Slump and Dragon Ball, so it's sad to see him go.

2024 03 06

Solutions forthcoming.

๐Ÿธ Ribbit

OK, so I wrote a little test case for how to deal with buffers of differing sizes. Of course this only iterates a few times, where in practice the flow will be continuous, but the setup should work. I also wanted to make sure this would work with different sample buffer sizes.
In the documentation for the Web Audio API, the buffer size is set based on the performance of the device it's running on, so it's very possible the buffer could deviate from 2048.

At the bottom I write a little cycle to iterate through an array of one size and feed it to the iterator that sends values to another, smaller array. The leftover values that were not sent to the feed are kept around, and the new input is added in after. This scheme seems to work, but I think there are some possible ways to optimize it. I'll need to test it with the decoder to see if it's a good solution.

By feeding it the actual array sizes of 256 into 160, I get an interesting pattern.

The colored areas represent arrays that would be fed to the decoder. A solid horizontal line shows the end of an iteration. You can see that in the first iteration at the top, only one array would be passed to the decoder; in the next iteration the decoder would be fed twice.

2024 03 05

This feels like a programming test.

๐Ÿธ Ribbit

  1. Make a routine that will take a sequence of values from 0 to infinity.
  2. Continuously fill an array that's 160 elements long from one that's 256 elements long.

Iterating a lot on different variations of this buffer solution tonight.
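One pure-JavaScript variation of that routine: carry leftovers between calls and emit 160-element chunks as they fill. With 256-sample input blocks this alternates between one and two chunks per call, which matches the one-then-two pattern from the iteration test:

```javascript
// Sketch: accumulate incoming blocks in a carry buffer and emit as many
// 160-element chunks as fit; leftovers wait for the next call.
function makeChunker(chunkSize = 160) {
  let carry = [];
  return function feed(block) {
    carry = carry.concat(Array.from(block));
    const chunks = [];
    while (carry.length >= chunkSize) {
      chunks.push(carry.slice(0, chunkSize));
      carry = carry.slice(chunkSize);
    }
    return chunks; // zero, one, or two chunks per 256-sample block
  };
}

const feed = makeChunker(160);
const first = feed(new Array(256).fill(0));  // one chunk, 96 carried over
const second = feed(new Array(256).fill(0)); // two chunks, 32 carried over
```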

2024 03 04

Some Indian food, and a change of pace.

🪄 Code Witches

Added in a text box at the bottom of the page view for keyboard input. Mostly for debugging and testing the game engine. The input events have special handling setups because the main page has various interrupts for handling touch to trigger the CW input. More to come on that later.

๐Ÿธ Ribbit

Tonight I'll set up some schemes to deal with leftover samples between the audio processing updates, to handle the mismatch in buffer sizes between the Ribbit audio feed and the audio process node. There should be something stable, reliable, and elegant; I just don't know what that is yet.

Progress

02.01 Began messing with Emscripten and figuring out how to build a WebAssembly module's interface with JavaScript. Took in the DSP code from the Ribbit app, started integrations, and figured out how to get data into and out of the DSP.
02.14 Figured out how to get an audio signal piped into the spectrum/waveform.
02.15 Working on getting the data into the decoder.
02.23 Started the encoding: consuming a text input field and returning an audio signal.
02.26 Produced a wav file from encoded text.
03.03 Major breakthrough on decoding incoming audio to text.

A little over a month in, and I have the basic parts of encoding and decoding in some form of working condition. In the next week or so I'm hoping to clean up the code and minimize the clutter to prepare for building the rest of the app. I'll also start building some of the XMPP components in a different environment, and I'll look at prototyping some more UI designs with logging and other options in mind.

2024 03 03

Cleanups and organizations before continuing. I need to understand exactly how I got everything working.

2024 03 02

A short day of errands.

๐Ÿธ Ribbit

Encoding the audio output from the encoder into a blob isn't as straightforward as I would have expected.

How about we just parse a bunch of numbers separated by a comma and a space? Sure, this is like 250k of text, but it's readable by a human, and it's what was produced by the encoder. In theory, this should be what the decoder is expecting.
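Parsing that text format back into floats is a one-liner, for what it's worth:

```javascript
// Turn "0.0, 0.25, -0.5, 1.0" back into an array of float samples.
function parseSamples(text) {
  return text.split(", ").map(Number);
}

const samples = parseSamples("0.0, 0.25, -0.5, 1.0");
// [0, 0.25, -0.5, 1]
```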

Fascinating, there seems to be something happening! FETCH DECODER was called!

Ah yes, the first decoded signal! Well, the signal is completely artificial, but the decoder is working. ๐ŸŽ‰ This is a pretty good step forward, actually. The Uint8Array in the module's memory was populated by the decoder with the Hello World! message and the fetchDecoded() function in the module told the javascript to retrieve the message from the buffer. The round trip is now complete!
The issue seems to be the Web Audio API.

Last night when I looked at the float values that were decoded from the wav file, the first 12+ samples were above 0 before any negative values. In Audacity, the audio editor, there were only 5 data points in the audio file that were above 0. I trust Audacity to show only what was decoded in the audio file's samples.
This means that the Web Audio's interface to extract sample data from the wav file was "filling in" more sampled values than were actually in the audio file.
What to do next.

Regardless of how the WebAudio interface works, I need to figure out how to ACTUALLY resample the audio to 8000hz. It's very likely that the web audio interface pretends it's working at that rate, but the data clearly shows the browser is ignoring those settings. That being the case, at 48000hz I'll need to skip 5 samples before feeding the decoder a value. On the other hand, it might be worthwhile to average every 6 samples before feeding them to the decoder. More experimentation is needed.
One thing though: if one of the skipped samples before the 6th one had important information, it would be lost. So I think the best thing to do is to merge the samples together and normalize the values before they're sent to the decoder. I can try using a 48000hz set of floats as a base to down-sample, and I believe that would be the buffer from the decoded wav file.
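A minimal sketch of the merge-before-feeding idea: average each group of 6 samples from the 48000hz buffer down to one 8000hz sample. The function name is just for illustration, and a proper resampler would low-pass filter first; this is only the naive averaging described above.

```javascript
// Downsample a 48 kHz Float32Array to 8 kHz by averaging each group
// of 6 samples, instead of simply keeping every 6th value. A rough
// sketch -- a real resampler would apply a low-pass filter first.
function downsampleByAveraging(input, factor = 6) {
    const output = new Float32Array(Math.floor(input.length / factor));
    for (let i = 0; i < output.length; i++) {
        let sum = 0;
        for (let j = 0; j < factor; j++) {
            sum += input[i * factor + j];
        }
        output[i] = sum / factor;
    }
    return output;
}
```

This way every input sample contributes to the decoded signal rather than 5 of every 6 being thrown away.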

The details are merged as the sample width smears the audio together. It's actually quite an interesting spectrogram when resampled to 48khz, above; the 8khz version is below. So I need to figure out a good way to mix samples together before feeding them to the decoder.
At the moment, there are 16384 samples of audio coming out of the encoder, or 2.048 seconds of buzz. So I need to write a simple wav → float → text converter to start my mixing tests.

Converting the wav to a 48khz list of float values certainly produces a longer file.

The 48khz version of floats is on the left; there are about 6x more 0s at the end of the data than in the shorter 8khz version. The minimap on the right edge of the larger text is a better indication than the main panel.

The super naive approach works, where I just keep every 6th sample from the 48000hz version and feed it to the decoder. So this more or less proves there's something going on with the audioCtx feeding the decoder garbage data in some way.

        const response = await fetch('./helloworld48khz.txt');
        // read text from response
        const text = await response.text();

        // parse text into a Float32Array
        const array = new Float32Array(text.split(',').map(Number));

        // populate the feed array with every 6th value
        for (let i = 0; i < array.length; i += 6) {
            feed[i / 6] = array[i];
        }
        
        Module._feedDecoder(FEED_POINTER, FEED_LENGTH);
    

You can see the for loop where the float array is just put into the feed array; the feed is the array in the module's memory that the function _feedDecoder reads from.
The one big proof here is that I can use 48000hz sampled data, pipe some of the content to the decoder, and get a result.
So the next step is to load the WAV file and see if I can feed every 6th audio sample, or perhaps calculate a division of the audio stream to feed it only 8000 samples per second in a 48khz context.

๐Ÿธ๐ŸŽ‰ Success! so messing with various parameters, I finally got a response out of the decoder!๐Ÿ“ปโšก
After fiddling, I was able to get the audio context to obey the 8000hz sample rate and feed that to a small buffer. There are a LOT of issues here with the decoding matching up with a window of time around a buffer filling up and starting over. So the decoder doesn't always have a chance to get all of a signal, but this is a good place to work from.

2024 03 01

Simplify: that's the goal this weekend. It's been a busy week for work; I'm starting into the meat of a new project, so a lot of my mental energy has been spent on that. I'll be making a bit slower progress on some of the side projects till the weather gets warmer and I have more energy. Speaking of which, I'm low on caffeine right now. ☕

๐Ÿธ Ribbit

So I think I'll start another fork of the example.cc into a decode.cc and use a function in the module to push 160 floats in at a time. Then when it's been fed, I'll get a callback to know to feed it the next 160 floats. I think that's the best way to go about it.

I'm making the assumption that the decoder is not time dependent; or rather, so long as it's getting usable values in order that add up to a decodable signal, a message should be produced.
I'm fairly certain that the array is being fed to the decoder properly, but I'm not getting any response from the decoder. The process is to read the decoded audio buffer into the wasm feed buffer. I resized the internal buffer to match the length of the wav's audio buffer size. Then once the feed function is called, I iterate through the 98304 values 160 at a time.
At this point it's not obvious where things are breaking, so I might need some further assistance with this one.

Comparing the interfaces between the android and swift projects. So far as I can tell, the only difference is that I'm directly referencing the audio buffer in my version and hard coding the sample_count. But since I'm decoding a fixed length array I can do that. Or, at least I think I can.

So the first function here was to compare the float values going into the feed function. The second is modified to use the feed array's pointer and length: Module._feedDecoder(FEED_POINTER, FEED_LENGTH). This would read back the same values as came from the encoded audio, which I should confirm is encoded at 8000hz.

Inspecting the data: the floats that are read back from the array are as follows for the first 10 values.

However, when inspecting the values in the audio of the hello world waveform, they show a rather different set of values. The float array should see 5 positive values followed by 4 negative values and a positive one. I should investigate if there's a way to export the float values from the wav file, or I might have to write a different way to save out the encoded message.
Indeed: garbage in, garbage out. From the looks of it, it's possible that the audio was exported at 48khz? If that's the case, then I could skip values and do a sort of bad resampling, keeping every 6th value. I'll have to look into that.

So it seems like that's not going to be usable data; garbage is garbage. Though it does make sense: if the sample rate is supposed to be 8000hz, meaning 8000 samples per second, and the export should be about 1.3 seconds long, there should only be about 10k floats. The float32 array length is roughly 98k, so that should have been a clue right there that something was off.
duh...
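The byte arithmetic can be sanity-checked in a few lines. The 16384-sample figure comes from the encoder output mentioned earlier; everything else is just the 4-bytes-per-float rule, assuming the wav held raw 32-bit float samples (a 16-bit PCM wav would halve the count):

```javascript
// Sanity-check the sizes: a Float32 is 4 bytes, so the counts work out as:
const bytesPerFloat = Float32Array.BYTES_PER_ELEMENT; // 4
const fileBytes = 64 * 1024;                          // the ~64kb wav file
const floatsIfRaw = fileBytes / bytesPerFloat;        // 16384 samples at most
const decodedFloats = 98304;                          // what WebAudio handed back
const decodedBytes = decodedFloats * bytesPerFloat;   // 393216 bytes, ~6x the file
const upsampleRatio = decodedFloats / floatsIfRaw;    // 6 -- the 48k/8k ratio
```

The ratio of 6 between the decoded float count and the file's own capacity matches the 48000hz/8000hz upsampling suspicion exactly.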

While researching setting the sample rate on the audio context, I found a reference to the Offline Audio Context 🔖 that could be used to resample audio. It's quite possible that the input is indeed 8000hz, but the web browser is inherently upsampling it as it's read in from disk. The wav file is 64kb, which is rather small. The byte size of a float32 is 4, so that's roughly 16000 samples, or possibly 8000 if there's a channel written with just zeros. Either way, 98k float values from a 64k file makes no sense, so there's very likely some re-formatting of the incoming data that expands the 64k into 98k. Oh, and that's 98k floats, so in bytes that's actually (98k * 4) about 392k of bytes! POPPYCOCK! I say! These are not the floats you're looking for. So this could very well explain why nothing is being decoded. Let's do things the hard way and just write the roughly 10k float values from the exporter into a custom format. Back to the encoder.html to save out a float blob!

const blob = new Blob([signalbuffer], { type: 'application/octet-stream' });
// Create an anchor element
const downloadLink = document.createElement('a');

// Set the href attribute to the blob URL
downloadLink.href = URL.createObjectURL(blob);

// Set the download attribute (filename)
downloadLink.download = 'helloworld.blob';

// Set the link text (optional)
downloadLink.textContent = 'Download Float Data';

// Append the anchor to the DOM (e.g., inside a container)
document.body.appendChild(downloadLink);

// Programmatically trigger the download
downloadLink.click();
setTimeout(() => {
    document.body.removeChild(downloadLink);
    URL.revokeObjectURL(downloadLink.href);
}, 100);
    

Octet stream it is, so I'll save the helloworld.blob from the encoder. I'll need to test reading this back in to check that it's actually audio data, so I'll also need to make a blob reader that converts the floats back into audio and plays it back.
So many hoops!

So it's about the same size as the wav, but it's also uncompressed. Additionally, the problem was not with the wav file itself: as observed in the audio editor, the first 5 samples in the saved file were different from those read back through the wav → audio buffer in the WebAudio API.

The first attempt to read the blob didn't work; reading back the octet-stream isn't as simple as I had thought. I'm pretty sure the correct buffer is being written to the blob; it just seems like the reader is trying to open the file as text. More to come after some sleep.
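The likely fix is to use response.arrayBuffer() instead of response.text(), since the raw bytes can then be viewed directly as a Float32Array. A sketch of the read-back, where the URL and the 8000hz playback wiring are my assumptions:

```javascript
// The key step: view the raw bytes as floats instead of parsing text.
function bytesToFloats(arrayBuffer) {
    return new Float32Array(arrayBuffer);
}

// Hypothetical wiring: fetch the saved blob and play it back at 8000hz.
async function loadAndPlayBlob(url, audioCtx) {
    const response = await fetch(url);
    const samples = bytesToFloats(await response.arrayBuffer()); // NOT .text()
    const buffer = audioCtx.createBuffer(1, samples.length, 8000);
    buffer.copyToChannel(samples, 0);
    const source = audioCtx.createBufferSource();
    source.buffer = buffer;
    source.connect(audioCtx.destination);
    source.start();
}
```

Since the blob was written straight from a Float32Array, viewing the fetched bytes as Float32Array should round-trip the samples exactly.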

2024 02 29

Plenty of things planned for the weekend already. 🗓️

🪄 Code Witches

I should add some keyboard input for Code Witches: a little box where you can enter text and have it automatically play back the CW. As letters are played back, they're deleted from the left edge of the input field, and the remaining characters in the field stay editable.
I should also add a slider for words per minute, as well as a configurable dit and dah time. By the standards, the time between letters and the time between words are derived from the dit/dah times.
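The standard relationship (based on the PARIS word convention) gives a dit of 1.2 / WPM seconds, with everything else a multiple of the dit. A sketch of what the slider would drive; the function name is just illustrative:

```javascript
// Standard Morse timing derived from words-per-minute via the PARIS
// convention: one dit lasts 1.2 / wpm seconds, and all other
// intervals are fixed multiples of the dit.
function cwTimings(wpm) {
    const dit = 1.2 / wpm;      // seconds
    return {
        dit,
        dah: 3 * dit,
        symbolGap: dit,         // between dits/dahs within a letter
        letterGap: 3 * dit,
        wordGap: 7 * dit
    };
}
```

At 12 WPM that works out to a 100 ms dit, so a custom dit/dah time only needs to override these defaults.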

2024 02 28

I might simplify a fork of the example.cc to have the short feed/read buffer so I don't have to sort around the cyclical array buffers.

๐Ÿธ Ribbit

Buffers, buffers everywhere. I'm discovering interesting, weird behaviors and limitations when it comes to storing the buffer in memory. First: once the audio wav.buffer is received through the fetch, you're not allowed to wait to use it. If you store it as float data, later reads return an array of 0s.

In context it's a buffer 98304 long, and iterating through 100k values in any context is going to be pretty slow. I'm wondering if it would be better to put that buffer in the wasmModule's memory.

2024 02 27

It's sort of the next day. Anyway, been writing JavaScript all day for work. I managed to write a nice little particle system for one of the projects. Thanks to GitHub Copilot, the task was nicely accelerated, and as a result I got a lot of work done. I'm going to take a break from coding for the rest of the day. I'll be back at it tomorrow.

Ok, I lied. I'm messing with feeding the array into the decoder through an animation frame. I might need to re-organize how I use the different memory modules since the original decoder setup was using a circular buffer (at least I think that's what it's called).

But I'm running late again, should get some 😴💤

2024 02 26

It's sort-of raining again.

๐Ÿธ Ribbit

Tonight I'll get a wav file out of the encoder so I can feed it back into the decoder.

Well, fancy that: GitHub Copilot helped fill in the bufferToWave function. Got the bulk of it from this 🔖 example. The original function Copilot came up with didn't seem to work correctly, but it did help make the automatic download after the audio plays. I'm seeing the chain of 0.00000 at the end of the sample; I think that's on purpose, something that could get filled in later with extra data maybe?

The above is the wav output from the new encoder.
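For reference, the general shape of a bufferToWave helper is just a 44-byte RIFF header prepended to the samples. This is a sketch for mono 16-bit PCM, not the exact Copilot-generated code:

```javascript
// Rough sketch of a bufferToWave helper: a 44-byte RIFF/WAVE header
// followed by the samples, converted here to 16-bit mono PCM.
function floatsToWav(samples, sampleRate) {
    const bytesPerSample = 2;
    const dataSize = samples.length * bytesPerSample;
    const buffer = new ArrayBuffer(44 + dataSize);
    const view = new DataView(buffer);
    const writeString = (offset, s) => {
        for (let i = 0; i < s.length; i++) view.setUint8(offset + i, s.charCodeAt(i));
    };
    writeString(0, 'RIFF');
    view.setUint32(4, 36 + dataSize, true);                 // RIFF chunk size
    writeString(8, 'WAVE');
    writeString(12, 'fmt ');
    view.setUint32(16, 16, true);                           // fmt chunk size
    view.setUint16(20, 1, true);                            // format: PCM
    view.setUint16(22, 1, true);                            // mono
    view.setUint32(24, sampleRate, true);
    view.setUint32(28, sampleRate * bytesPerSample, true);  // byte rate
    view.setUint16(32, bytesPerSample, true);               // block align
    view.setUint16(34, 16, true);                           // bits per sample
    writeString(36, 'data');
    view.setUint32(40, dataSize, true);
    // clamp floats to [-1, 1] and scale to 16-bit integers
    for (let i = 0; i < samples.length; i++) {
        const s = Math.max(-1, Math.min(1, samples[i]));
        view.setInt16(44 + i * 2, s < 0 ? s * 0x8000 : s * 0x7FFF, true);
    }
    return buffer;
}
```

The returned ArrayBuffer can go straight into a Blob of type audio/wav for the download link.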

So exporting the wav file wasn't so bad, thanks to some fancy AI code assistance. So let's feed the wav into the decoder and see what we get.

So when the wav is fetched, the result is converted into a buffer. The buffer is a Uint8Array. To make a single Float32, a conversion using DataView takes every 4 bytes and converts them to a float. I suspect that the conversion from four ints to a single float32 isn't what is expected by the decoder. Though, I am playing back a wav file that has its sample rate set to 48000hz, single channel.

So it's interesting that the float32 array that's needed to send to the decoder might actually be every 4 ints? Oh fuuuu, silly me.

The uint8Array I was looking at was the wav file data, not the decoded audio. I was in the wrong part of the function. So yes, I believe the Uint8Array is not what I should be looking at.

Starting off, I just need to get 160 values into the chunk of floats that gets fed to the decoder. But it's late, so I should get some sleep.

2024 02 25

Lots of anime I need to catch up on. I'll see how far I get on some of the next steps. I should also look at making a fork of the ORI git repo on my personal git and pushing my changes. That way I can make an official pull request to the ORI git repo.

Well, that took forever: updating apps and the dev environment.

It's late, I have work in the morning. 😴💤

2024 02 24

Using GitHub's Copilot. Some yardwork, and encoding. Also reading an interesting article 🔖 about JavaScript nullish coalescing operators. I like the brevity of these operators, but they can be a bit hard to read when reviewing code.

๐Ÿธ Ribbit

Setting up for some encoded text to an audio signal. There is an existing web app that does this but with a different wasm module, so the interface to that is likely quite different.

🥳 Hello World.asdasdasdasdsd Some random extra letters just to see that I wasn't just seeing things. Seems like I need to confirm the size of the required buffer of floats to send. It's a bit long right now, but that's nothing I can't fix.

Next is the 8000hz spectrum to visualize the audio feed going into the decoder, and I would like to make one for the outgoing encoded message so I can compare.

2024 02 23

Happy Friday! Time to pick up some food 😋 and start testing some decoder stuff. Also should look at getting the full buffer out of the encoded message. I should also at least look at the Code Witches stuff so it doesn't get too cold.

๐Ÿธ Ribbit

Tonight I'll be testing the pointer chase in the array, need to make sure the reading never catches up to the writing.

Something tells me that there's something fishy going on in the reading of the buffer. The numbers in the above image should at least be reasonable float32 values, and most certainly not -nan, negative not-a-number to boot! 😱
At the very least I can see that the feed is being read into the decoder, and there aren't any really weird errors going on, other than feeding the decoder unreasonable numbers.

So, thanks to GitHub's Copilot AI, I managed to get a very useful bit of insight into a simple problem that I'm pretty sure would have taken a long time to look up: the conversion in the EM_ASM block from a bool to an int and back into a bool. I suppose the limited types that can be shuttled between WebAssembly and JavaScript have workarounds, but !!$0 isn't exactly intuitive. I'm not sure if this is a good thing or a bad thing, but it's certainly a thing. 🤔

Using printf everywhere isn't the best thing to do, but it's rather difficult to step through a compiled binary.
The above did result in repeated buf, accumulated, processed dumped to the console. So there's certainly data being fed into the decoder.
I repeatedly transmitted from the Ribbit app on the phone, but never got the decoder to change behavior. I'm assuming the analytic(audio_buffer[i]) is where everything gets kicked off. But that code is pretty convoluted and fairly hard to debug since it's mostly a bunch of math conversions and nothing I can figure out the inner workings of. So that's where I'll need to ask some questions about what data it's expecting.
Doing so just shows a lot of 0xABCDEF and line.put-like assembly. I tried adding the -g option to the build of the module, but that just meant there'd be "feedDecoder" followed by 0xABCDEF etc. Also, I ate dinner about 5 hours ago, so I'm glad I got an extra sandwich when I ordered food earlier. 🥪

But the float array in the feed is looking pretty reasonable, so this I'll count as progress. I may look into the latency of the audio feed; perhaps the samples are actually skipping values, breaking up the continuity of the values flowing into the array. Something I might do is make a version of the color spectrum for 8000hz instead of 48000hz to help visualize the audio feed. I think that might help me understand what's going on.

That's all for tonight, I'll begin working on encoding text to audio tomorrow. I think I'll have to look at the Code Witches stuff as well, so I don't get too far behind on that.

2024 02 22

But not for too long. Some Ribbit buffer tricks and Code Witches for this week. After getting some work done tonight I'll begin working on the double buffer. So a buffer that's 2560 float values, where the write gets 256 * 5 ahead of the read that's 160 * 8 behind. Confusing? Yes, but I think it's the best scheme that has a chance of working.

๐Ÿธ Ribbit

Working on an isolated test again to see if I can get the pointer offset working on the same array. This also means throwing out that worklet code, kind of sad about that. It's really annoying when someone gives you a part of an API that's incomplete.

Something interesting in the Edge browser's debug tools is the audio context debug panel. 480 frames with 2 channels. Though I have been using channel(0) all this time, it's interesting to note that it might be worthwhile to use a channel merge to mix down any stereo mics into a mono signal. Or at least inform the user that I'm using the zeroth channel. Whichever that might be, left or right... But it's just one of the two channels available.

Racing to keep ahead. I think this is the solution, though in my head I think the 160 read might get lapped? Though in theory, because they both end up at the end of the 2560-long array at the same time, sort of, the read shouldn't ever get lapped.

My test for this will come tomorrow, but if I set the writes sequentially, say 0 for the first 256 values, 1 for the next 256, then 2 for the following 256, so it's [0x256][1x256][2x256] etc., the reads should always show [0x160][0x96,1x64]: lower numbers followed by higher numbers. If the write ever passes the read, then we'd have higher numbers followed by lower numbers.
I should be able to write a test to check for that, but it's 3am, so I'll leave that for tomorrow. And tomorrow is Friday, so I should be able to do some more detailed work 🛠️.
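That check could be sketched as a tiny helper: within any read chunk, the stamped sequence numbers should never decrease (this ignores wrap-around of the stamps, which a real test would need to handle):

```javascript
// Lapping check for the scheme described above: within one 160-value
// read chunk, sequence numbers must be non-decreasing. A higher number
// followed by a lower one means the write lapped the read.
function readChunkIsOrdered(chunk) {
    for (let i = 1; i < chunk.length; i++) {
        if (chunk[i] < chunk[i - 1]) return false;
    }
    return true;
}
```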

2024 02 21

Time to do a bit of nothing. It's important to take a break every once in a while.

2024 02 20

Various schemes to pipe data from the audio feed into the float32Array. With a frequency of 8000hz, if we keep the buffer at 160 samples, then we can feed the decoder 50 times per second.
But I think the minimum size of the buffer is larger than 160.
I'll have to test that.

Yes, that's the case: setting a buffer to 160 throws an error requiring the minimum buffer size to be 256. So we need to come up with a system to match up buffer reads with the sampleRate. I wonder if we can count samples. This means that at 8000hz with a 256-sample buffer, we are filling the buffer 31.25 times per second, while the decoder expects 50 feeds per second.
I need to come up with some way to feed the decoder at a different rate than the scriptProcessor operates at.

The problem

The WebAudio API's processing rate is determined by how quickly the sample rate fills a buffer. With a sampleRate of 8000hz, a buffer of 256 is filled 31.25 times per second.
The Ribbit decoder expects 160 samples, 50 times per second, at the same sampleRate.
The two systems must share reading and writing to a shared array.
So we need to figure out how to fill the buffer at a different frequency than it's read, and the read should never reach a part of memory that hasn't been updated.

  1. SampleRate must be 8000hz
  2. The WebAudio API requires a buffer size of 256, 512, 1024, 2048, 4096, 8192, or 16384
  3. The web audio processor's callbacks per second are determined by how quickly the sample rate fills the buffer.
  4. The decoder requires 50 feeds per second at 8000hz (a buffer of 160 values each)
  5. The decoder must read from shared memory.

Interesting problem to solve indeed. If the shared memory is larger than the processNode's buffer, we can always keep track of writes and change where in the shared buffer we're writing to.

buffer: [0-----------------------------------------------#####?]
write:  [0-----256][0-----256][0-----256][0-----256][0-----256]
read:   [0-160][0-160][0-160][0-160][0-160][0-160][0-160][0-160]
    

It's important that at some point the reads and writes end up in the same place, otherwise a read might get past the write, or the write will get too far ahead of the read and the read might miss samples. The diagram above is just an illustration of the problem, also helps me think of what I'm looking to solve. There's no actual limitation on the size of the shared memory, just the buffer size assigned to the processNode.
(256 * 5) / 160 = 8 or 1280 / 160 = 8
If the buffer assigned to the processNode is 256, and we had a shared float32Array that was 1280 long, then we could fill in the full array every 5 frames. During that time we can read the same array 8 times? I'm not completely sure this is correct. Next I have to figure out how to make sure that the read happens at 50 fps, or, if it's possible, can I feed it twice at 25fps?
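A rough sketch of the scheme: a shared 1280-float array where 256-sample writes stay ahead of 160-sample reads, with the read refusing to proceed until enough has been written. The class and method names are purely illustrative:

```javascript
// Shared buffer: 256-sample chunks in from the audio process node,
// 160-sample chunks out to the decoder. Reads only proceed while
// they stay behind the total written count.
class SharedFeedBuffer {
    constructor(size = 1280) {
        this.buffer = new Float32Array(size);
        this.written = 0;   // total samples written so far
        this.read = 0;      // total samples read so far
    }
    write(chunk) {          // e.g. 256 samples from onaudioprocess
        for (let i = 0; i < chunk.length; i++) {
            this.buffer[(this.written + i) % this.buffer.length] = chunk[i];
        }
        this.written += chunk.length;
    }
    readChunk(n = 160) {    // returns null until n unread samples exist
        if (this.written - this.read < n) return null;
        const out = new Float32Array(n);
        for (let i = 0; i < n; i++) {
            out[i] = this.buffer[(this.read + i) % this.buffer.length];
        }
        this.read += n;
        return out;
    }
}
```

With 1280 = 256 * 5 = 160 * 8, the write and read positions line up again at the end of every pass, which is the property described above.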

In the updated cpp, EXTERN EMSCRIPTEN_KEEPALIVE void someFloats_read(int length, int offset), we use a second offset parameter to tell the read where in the array to begin. The JavaScript will look like someFloats_read(256, 512), where we read the next 256 values starting at 512 in an array that's 1280 long. For testing I'll just populate the array with 256 1s, then 2s, 3s, 4s, etc. Then the readbacks should show overlaps on the second read, where there should be some leftover 1s followed by 2s. (I'm doing this in an isolated emscripten cpp file)

So a single iteration seems to set and read back correctly.

So the two arrays populate and match. And I think this should work out so long as the writes are ALWAYS ahead of the reads. The weirdness is that normally there would be some lock/release on the array so it's not being read at the same time something is writing. Otherwise you might read something incomplete, which would likely be garbage data. Actually, if I set the current frame that the write is on, I can keep the read going so long as it's behind the write.

The next step is to only have the array read each segment once, but it's getting late, so I'll have to let that go for tomorrow.

[TODO]

So I think I have a solution to the array sharing between a 31.25 fps write and a 50 fps read. The if statement waits until more values have been written than have been read. The next gate needs to be: if I've already read a segment of the array, don't read it again. This is a bit more tricky.

There's some progress at least...

2024 02 19

Happy Presidents Day 🎉! Today 🍃 the wind here is crazy and there's plenty of rain as well. So far the fence repair is holding up. The ground is saturated and the mud is soft; I'm kinda surprised I have a yard at all, considering part of my yard is up on a hill above our neighbors.

๐Ÿธ Ribbit

So last night I left off with getting some sounds out of the encoder. TextArea -> text -> bytes -> wasmMemory.buffer -> encoder.init() -> wasm.callBack -> browser.read -> buffer -> bytes -> float32array -> AudioBuffer -> AudioContext -> bufferSource.start(); -> 🔊
Seems convoluted, but that's how it's gotta go. Making some noises and listening for some noises is the basics for Ribbit, so we're getting closer. Getting some sounds out is a pretty good step forward. This does indicate that the inner workings of the web assembly module are doing something correct. Now if I can get the sound to come out as one chunk then that'd be great!

For Today

I think today will mostly be stepping through the android project and seeing how the main activity switches between listen and transmit. I need to figure out how to use the audio stream to start/stop the listener.
Unlike a native Kotlin/Java/Swift mobile app, there's not really a "main" thread in a browser that could "control" the audio buffer. Instead, the audio stream lives on its own; the WebAudio API is a self-contained entity within the browser framework. I should also look into the html "template" that emscripten can use.

int sampleRate = 8000;
int sampleSize = 4;          // bytes per float sample
int channelCount = 1;
int writesPerSecond = 50;
double bufferSeconds = 0.5;
// 8000 / 50 = 160 samples per write
outputBuffer = new float[(sampleRate * channelCount) / writesPerSecond];
// (8000 * 0.5) / 160 = 25 buffers per half second
outputCount = (int) (sampleRate * channelCount * bufferSeconds) / outputBuffer.length;
// 25 * 160 * 4 = 16000 bytes
int bufferSize = outputCount * outputBuffer.length * sampleSize;

It looks like the outputCount is also used for transmitting, so the fixed size of the input buffer is the same as the output buffer. The symmetry there is pretty nice. The writesPerSecond is interesting, since 8000 / 50 = 160, which is what I observe being fed to the decoder. So tonight I'll revisit the decoder and check how well I can stream 160 floats at a time into the decoder->feed() function. The buffer itself cannot be smaller than 256, so I may have to feed a few times per audio process.
There's also the issue that the processNode is deprecated and I need to replace this with an audioWorklet 🔖 So I may start a new decoder test that uses the audioWorklet in place of the inline processNode. (Ah, so many things.) At the very least, I am getting more comfortable with building these wasm modules. So with this in mind I'll build toward matching the android project as closely as possible for streaming audio into the decoder.

Starting with the example noise generator, I'll use this to feed the decoder the array. The trick is bridging the float32Array from the module into the worklet.
Currently working my way through reading up on the AudioWorklet interface and members 🔖 There's a lot here, and I think this should have what I need to feed the decoder. This isn't like the previous PWA where the feed was using the audio stream in a continuous cycle.
The decoder needs the float32Array passed in 160-value chunks, 50 times per second, from an 8000hz audio feed, so there is a pretty important difference between the previous demo PWA and the decode/encode message ribbit app.

The color spectrum/waveform/spectrogram was able to update once on every sample at the default 48khz; basically everything set to default will work as expected. The decode/encode app needs to run at 8khz, and every 160 samples a buffer needs to be pushed to the decoder. So I have something to work toward. Also, I'll be using the audioWorkletNode rather than the processNode since the latter is supposed to be deprecated. 😔

๐Ÿ“I's worth noting that because audio processing can often involve substantial computation, your processor may benefit greatly from being built using WebAssembly, which brings near-native or fully native performance to web apps. Implementing your audio processing algorithm using WebAssembly can make it perform markedly better.

From the documentation on the audio worklet node, there's a direct mention of WebAssembly for processing audio. So it's best that I learn how to use the audio worklet.
Figuring out how to use the constructor of the new class isn't obvious. I'd like to think it would be easy, but so far I think the constructor I wrote is being completely ignored.

In the reference for the audio worklet processor, the constructor can take an options argument which includes additional parameters for custom data.
However, when I make the instance of my decoder node, the options variable is undefined; when I submit anything to the constructor, nothing goes through. So after some debugging and searching, I find it's just broken. Or at least that's all I can conclude. I did learn quite a bit about audioWorkletNodes, but I don't think they're ready to replace processNodes just yet, so I'll end my research into the audio worklet and continue with the eventually-to-be-deprecated processNode. 🤦...

2024 02 18

I need to do a bunch of debugging, and it's 🌦️ today, so I'll remain indoors for the time being.

๐Ÿธ Ribbit

I have the old Rattlegram Android repo, but it's old, so I can't get it to build in Android Studio - not really sure what's going on. Likely an Android SDK I'm missing.
I'm going to rebuild the project, just import all the stuff from the old project into a new project. From there, I'm going to get it running on my dev phone so I can step through the code line at a time and see how a working version works. Then I can begin to match the web version to a working example. That's my Sunday.

โฌ†๏ธ Building phone apps always involves a bit of time for set-up, updates, patches, and version upgrades. ๐Ÿ“ฑAndroid Studio wants an update, so I'll start there. After this, I might look into the ios version, just to compare.

Amazing that I can't even start a new Android project in Java any more. They're going all in with Kotlin, and really throwing Java out the window. I'm not even sure I can copy over any of the old Java stuff; I'll have to redo a bunch of it in kt.
Also, Gradle is broken in my dev environment, so I need to fix that before building the android project 😔. Downloading and installing so many things. Oh, I found an option to download and install the Oracle OpenJDK, so that might help build the android project. Getting native app dev environments set up is always more work than it should be.
After a bunch of patching and downloading, I can finally build an empty project. Next is integrating the native-lib.cpp from the old project.

Building and deploying a native android app.

This is for the benefit of being able to step through working code: seeing the values and lengths of arrays, and watching step by step what is going on and how it's doing it.
Updating the Ribbit android project from ORI. Seems like the versions are at least compatible with the current Android Studio, so it's downloading and installing the project libs used to build in Gradle (or so it seems; I'll know in a bit once it's done).

        Executing tasks: [:app:assembleDebug] in project C:\git_repos\OpenResearchInstitute\Ribbit_Android
        ...
        BUILD SUCCESSFUL in 717ms
        38 actionable tasks: 5 executed, 33 up-to-date
        Build Analyzer results available
    

Well, look at that! I've deployed the android app on two devices and they can talk to one another via Ribbit audio.
🎉 So I have the native Android app building to my devices. Now I can step through the activity and watch variables as they update.

Observations:

From the observations, I should figure out how to move the update of the decoder into a main thread. In wasm, there's a main task/activity loop that runs in the module. So I should have the javascript populate the float32array in the module from the browser. Then in a regular update the wasm can read what's in the buffer and check for things to decode.
Let's figure out how to get the decoder working in a wasm main activity loop, or get anything into a main activity loop for that matter.
On further thought, it's possible for a loop inside of the module to run faster than the requestAnimationFrame() function in the browser. If this happens then the feedDecoder() function will read un-updated values in the buffer, or worse read the buffer while it's being updated.
That's bad.
Instead, we should use the requestAnimationFrame() in js to fill the buffer and then tell the module to feed the decoder.
In a native app environment, the audio and feed can happen in order since there's one thread. I have to manage the order of operation manually in java. One problem though is that the WebAudio api and the request animation frame happen independent, so in a strange turn of events, it's more likely to use the WebAudio api in place of a request animation frame, since it's happening on it's own.
๐Ÿค” certainly a different way to do things, but it should work.

        navigator.mediaDevices.getUserMedia({
            audio: {
                echoCancellation: false,
                noiseSuppression: false,
                autoGainControl: false
            },
            video: false
        }).then(stream => {
            const audiocontext = new AudioContext({ sampleRate: 8000 });
            const source = audiocontext.createMediaStreamSource(stream);
            const processor = audiocontext.createScriptProcessor(256, 1, 1);
            source.connect(processor);
            processor.connect(audiocontext.destination);
            processor.onaudioprocess = e => {
                const data = e.inputBuffer.getChannelData(0);
                // copy all 256 input samples into the module's float buffer
                someFloats.set(data);
                Module._someFloats_read();
            };
        });
    

The above works as expected; there's no requestAnimationFrame() and the readback is populated with updated floats. Strangely, the errors I was getting when setting "sampleRate" to a non power of 2 (the reason I'd been using 8192) don't seem to be coming up. 🤷 Not sure how that worked out. Perhaps in the last month, the WebAudio API in the browser was updated to support non power of 2 sample rates!?
That's good: {sampleRate: 8000} it is! Yay!

So the Ribbit wasmModule builds; I added some callbacks using EM_ASM to tell the javascript when certain functions have finished writing to memory.

So I'm putting the message to be encoded into memory.

The encoder leaves chunks of audio in the buffer, but there's no indication of how much is in the buffer. Though I believe it should be a fixed size, there's nothing indicating what that size should be. In the video above, I'm playing each chunk at a time. This would be sending hello world, just not all at once.
The story so far:

  1. text in the textarea is converted into Uint8_t values
  2. values are copied into the web assembly memory buffer
  3. web assembly is told to read the message in the buffer
  4. web assembly feeds the encoder
  5. encoder converts text into audio
  6. audio float32 values are copied into a float32array
  7. browser is told to check the float32array in web assembly memory
  8. browser feeds the float values into an audio context buffer
  9. web audio plays what it has in the buffer
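Steps 6 through 9 can be sketched like this (the pointer/length handling and names here are stand-ins, not the module's actual exports):

```javascript
// Copy the encoded samples out of wasm memory; slice() detaches the copy from
// the module's buffer so a later encoder run can't overwrite what we play.
function readEncodedAudio(memoryBuffer, bytePtr, floatCount) {
    return new Float32Array(memoryBuffer, bytePtr, floatCount).slice();
}

// Browser-only playback: the AudioBuffer must match the encoder's output
// format, 8000hz and a single channel, or it plays silence or garbage.
function playSamples(samples) {
    const ctx = new AudioContext({ sampleRate: 8000 });
    const wave = ctx.createBuffer(1, samples.length, ctx.sampleRate); // mono!
    wave.getChannelData(0).set(samples);
    const source = ctx.createBufferSource();
    source.buffer = wave;
    source.connect(ctx.destination);
    source.start();
}
```

In the page this would be something like readEncodedAudio(Module.wasmMemory.buffer, ptr, len) followed by playSamples(...), with ptr and len coming from whatever getters the module ends up exporting.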

So the encoder is indeed working: I'm feeding it a message to encode, and the encoded sound is being made. I just need to figure out a more reliable way of playing it back! To get this far, it's important to note that the audioContext sample rate to play back the signal is 8000hz, and it's a single channel, so the audioContext created must match. I was debugging for a while; the default setting for an audioContext is 2 channels, so playing back the signal would fail, silently. 🤦 (pun intended). No errors, just silence. So I had to dig through some old code to make sure the browser could make any noise at all.

    const button = document.createElement('button');
    document.body.appendChild( button );
    button.innerText = 'Sin Wave';
    button.onclick = () => {
        const audioCtx = new AudioContext();
        const wave = audioCtx.createBuffer(1, 40000, audioCtx.sampleRate);
        const buffer = wave.getChannelData(0);
        for (var i = 0; i <  wave.length; i++) {
            buffer[i] = Math.sin( i / 600 ) / 10;
        }
        let buffersource = audioCtx.createBufferSource();
        buffersource.buffer = wave;
        buffersource.connect(audioCtx.destination);
        buffersource.start();
    }
    

A short snippet to make a button that makes a sort of click noise. I had one somewhere that actually made a beep, but anyway. I have partial encoded signals being played through the browser; I just need the length of the float array sent to the javascript so I know how many floats to load into the audio buffer!

The decoder is a bit more tricky but I suspect there is a similar problem with the length of samples being fed to the decoding function. But more on that later!

2024 02 17

Figuring out wasm in more detail continues.

๐Ÿ“ฆ WASM General

Callbacks and pushing values into js from the module. Last night I spent a lot of time allocating, setting, and reading various kinds of types to the wasmMemory.buffer without any weirdness. I tried things like telling the js that the length was 512 when the buffer in the module was only 256; the js "thought" it was writing, but the module ignored anything beyond the length of the allocation. No errors.
Potentially bad behavior; it would be better to get at least a warning that the js was trying to write into unallocated memory.

So from the cpp in the module, the EM_ASM macro will call the encapsulated function in the window scope of the html.

Either of these setups seems to work correctly. The problem is getting data from the body of the module into the function, though the function in js it calls could just be another callback into the module.
module->invoke js function->js->invoke module function->get value from module.
It's a bit roundabout, but it would work. Ideally: module->invoke js function with a value. This would be good for: decode signal -> send decoded value to js.
📚There is another setup where a --js-library jsLibrary.js can be added to the build command; this tells emcc to include additional js functions in the window/global scope, but I think there are still similar limitations. Though it would be odd to pipe, say, a long array of Uint8_t values to the js through a function call, so perhaps a method ding_Fries_Are_Done() would indicate to go get the 🍟.
Or rather, 🛎️ decoding is done, go read the buffer to get the result. It's also possible that only a single function call would be required, e.g. eventFromModule("get_decoded_message") and/or eventFromModule("get_color_array") would just invoke a switch statement ready in the js to trigger different functions, or even dispatch a custom event with the incoming string.
🤔 dispatchEvent(new CustomEvent(messageFromModule)); and any js function waiting for the event would execute after getting the custom event. I kinda like the simplicity of the event dispatching from a single point of entry coming out of the module; it certainly reduces the number of breaking points.

Turns out, with the formatting above where the first encapsulated call is made, a value can be passed from the module into js!

Okay, so we'll use that since it's something that can be done in one line! Compact, no additional files to add, and it does the thing I need, which is pass a "result" value from the decoder.

๐Ÿธ Ribbit

So back into the example.cc modifications. One of the issues I was seeing was _get_Pointer() function calls returning 0, which is unlikely. I should throw a warning if any one of the pointer requests returns 0, since that would indicate something is broken.

2024 02 16

Working toward a better understanding of the wasm stack/heap. So there are a bunch of objects related to different sections of memory in the stack: heaps of memory for different types. I was able to check on float values making it into the stack and having a function see them.

๐Ÿธ Ribbit

I believe the issue I was seeing last night is related to Uint8 types being mangled because they were not being read correctly from the same stack. So tonight I'll build a different test that just pushes and pulls Uint8s, Float32s, and other types of arrays into memory for various functions to read back, make changes to, and send back to javascript to check the round trip.
Also, I get a 3 day weekend, and it's 🌧️🌧️🌧️ so I'll be trapped indoors, and I'm done fixing up the fence, so I'll have plenty of time to muddle my way through wasm Memory 💾 space.

Hoping to confirm a few things:
๐Ÿ” Getting data to survive the round trip into and out of the stack. I also want to check on other ways of getting some data into and out of the wasmModule.
Is it possible to call a javascript function directly from wasm? Like an event, const onWasmEvent = e => console.log(e) and in the cpp, void wasmEvent () { callEvent("done"); } and in js i'd see "done" appear in the console? This would be great that way every time a message is decoded i'd be able to update the text area with new messages. Right now i'm polling the stack and just updating the text area with whatever is in the stack, and right now it's just a bunch of garbage data. (Probably a bunch of passwords stored in a random part of the browsers memory ๐Ÿค”)

Also, found a nice article on various array copy methods in js. Might use one of these instead of the for loop to copy a js array into a wasm buffered array.
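For instance, TypedArray.set does the whole copy in one call (heapF32 and bytePtr below are stand-ins for the real module heap and a malloc'd pointer):

```javascript
// A pretend wasm heap and a pretend byte pointer into it.
const heapF32 = new Float32Array(1024);
const bytePtr = 512; // byte offset, as a malloc would return
const samples = Float32Array.from({ length: 256 }, (_, i) => i * 0.5);

// for-loop copy, element by element (HEAPF32 is indexed in floats, not bytes)
for (let i = 0; i < samples.length; i++) heapF32[bytePtr / 4 + i] = samples[i];

// the same copy as a single call; engines optimize this like a memcpy
heapF32.set(samples, bytePtr / 4);
```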
As far as wasm tutorials go, not too many of them elaborate on a slightly more complex project involving more than one use of the stack at a time. It seems that venturing into more than one function pulling/pushing into the stack with different types is beyond the scope of most tutorials.
I may have to write a tutorial on my own findings 🧑‍🏫...

Something to look into tonight as well: $ emcc example.cc -o test/test.js -msimd128 to enable SIMD, plus adding #include <wasm_simd128.h> to decoder.hh to make the polar function faster. I did this, but it's not faster, and the same warning about memory persists; maybe I need to use the vector functions from the SIMD lib to get any use out of it. (more to come)
reference simd.

๐Ÿช„ Code Witches

Also hoping to get a map view and transitions working. Also may as well look at doing some work on a tutorial setup so when you first try out the demo, you can get a nice intro into the UI.

๐Ÿ“ฆ WASM General

๐Ÿ”ฌTonight is more or less slated for some very structured investigation into how wasm memory works. How to get floats ints strings and all that into and out of the wasm stack. I'll also figure out if it's possible and how to trigger functions in js directly from wasm as events. So something like onEvent = (event) => { console.log(event) } can be raised in js from the wasm module.

Setting and getting an int works as I would expect; passing a single value to a variable is easy.

An int named someInt stored in the wasm is easy to change and get, and it's easy to get a single value from a wasm function with a return value. So how do we deal with a long set of float32 values?

So the above code should fill the wasm memory with values from 0.0 to 255 * 0.11111. And it does!
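The snippet being referred to isn't shown here, but it presumably did something like this (a reconstruction; in the page the view would wrap the module's memory rather than a fresh array):

```javascript
// Fill a float view with 0.0 up to 255 * 0.11111, one step per element.
function fillFloats(view) {
    for (let i = 0; i < view.length; i++) view[i] = i * 0.11111;
    return view;
}

// In the browser the view would come from the module, e.g. (assumed names):
// new Float32Array(Module.wasmMemory.buffer, Module._someFloats_pointer(), 256)
const demo = fillFloats(new Float32Array(256));
```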

And we can have the module and the js both read the same block of memory from the wasm memory buffer. This is how we send decoder.hh a block of floats recorded by the WebAudio api: we set the float array, then the feed function reads the buffer we just updated.

For clarity, I was curious about the pointer value; it turns out it's not 0 but 66304, so the address in memory is certainly not the zeroth position in the stack.

As expected, the Uint8_t array prints properly as well, and its pointer address is different from the float32 array's: 67344, which happens to be 1040 bytes past the float array's address. The ribbit code I've been working on follows the same pattern, so where is it breaking?

2024 02 15

Getting close to encoding/decoding in the browser (at least I think). Setting up new projects in 8thwall for work, and lots of anime.

๐Ÿธ Ribbit

So last night I got the float array from the audio as a Float32Array into the wasmMemory.buffer. Then my processFloatArray() in the cpp was able to read the array back out.
This becomes a path into the decoder. The same data pipe is used by the spectrums, so it's performant!

Spent the better part of an hour debugging some audio problems; turns out my mic was on mute 🤦... Okay, back to getting stuff working.

Something seems off. Not sure if it's the sample rate throwing things off, but I'd guess that something other than garbage would be coming out of the payload if that were the case. The image shows the output of a string converted from a byte array, followed by the byte array.

Somehow, the above image passes along a byte array to the wasmModule.

But then in the module, the above gets dumped as a very long series of values. Each time I press the send button to call that function, it gets a different array of values.
SO! This means the buffer I'm pushing bits into is not being set correctly, and/or the bits being read are not the right ones.

I think I'm doing this right, so I'll need to write up a few tests to check the code. I think I have the pointers or the lengths wrong.

๐Ÿช„ Code Witches

I got a cute little map pixeled out, I just need to name the buildings and create some interior views so you can talk to the people in the buildings.

2024 02 14

Happy Valentines day and Ash Wednesday. Planning on looking at how the input is piped into the decoder tonight.

๐Ÿธ Ribbit

In the android project, the initAudioRecord code sets the sample rate to 8000, the reads per second to 50, and bufferSeconds to 2.

int sampleRate = 8000;
int sampleSize = 4;
int channelCount = 1;
int readsPerSecond = 50;
double bufferSeconds = 2;
recordBuffer = new float[(sampleRate * channelCount) / readsPerSecond];
int recordCount = (int) (sampleRate * channelCount * bufferSeconds) / recordBuffer.length;
int bufferSize = recordCount * recordBuffer.length * sampleSize;
    

So for reference: (sampleRate * channelCount) / readsPerSecond turns into (8000 * 1) / 50, which is 160. I was arbitrarily feeding it 256 without any errors, so 160 sounds fine. I'll have to look at these numbers a bit more carefully tonight. recordCount is (8000 * 1 * 2) / 160, or 100, and bufferSize is 100 * 160 * 4, or 64,000, which is used in audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate, AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_FLOAT, bufferSize, AudioTrack.MODE_STREAM); the initialization of the audio track.
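Re-running that arithmetic (note the java divides by recordBuffer.length, not by readsPerSecond, when computing recordCount):

```javascript
// Same constants as the java snippet above.
const sampleRate = 8000, sampleSize = 4, channelCount = 1;
const readsPerSecond = 50, bufferSeconds = 2;

// floats handed over per read
const recordBufferLength = (sampleRate * channelCount) / readsPerSecond; // 160
// how many reads fit in the 2 second buffer
const recordCount = Math.trunc(sampleRate * channelCount * bufferSeconds) / recordBufferLength; // 100
// bytes handed to the AudioTrack constructor
const bufferSize = recordCount * recordBufferLength * sampleSize; // 64000
```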
Then in a cycle we get:

public void onPeriodicNotification(AudioRecord audioRecord) {
    audioRecord.read(
        recordBuffer,
        0,
        recordBuffer.length,
        AudioRecord.READ_BLOCKING
    );
    if ( feedDecoder(recordBuffer, recordBuffer.length)) {
        byte[] payload = new byte[256];
        int result = fetchDecoder(payload);
        if (result < 0) {
            binding.status.setText(
                R.string.payload_decoding_error
            );
        } else {
            binding.status.setText(
                getString(
                    R.string.message_status,
                    new String(payload).trim(),
                    getResources().getQuantityString(
                        R.plurals.bits_flipped,
                        result,
                        result
                    )
                )
            );
        }
    }
}
    

I should check how often this notification is called. It looks like private float[] recordBuffer; gets populated by the audio interface, and once it's filled it's forwarded to the decoder while listening. When the buffer reaches its limit it's fed to the decoder, which returns a bool when it's done.
When true, a byte[256] array is handed to the fetchDecoder function, which is then converted into a string for reading.
From the code above, I think we send a chunk of float[] to the decoder every 160 samples. Interesting, though: the spectrum and stuff from before were sampling at a different rate than the decoder. I'm not sure, but I don't think the media interface can instance two audio sessions with different sample rates; I'll have to try that. The decoder seems to want 8000hz vs the Visualizer at 48000hz. 🤷 I have a feeling this is more of a sample counter, not so much a setting for the audio recorder.

Later tonight after work I'll start feeding the decoder various combinations of byte[] arrays and float[] arrays. I need to figure out what the AudioRecord class in java does so I can make a javascript version of it. The audio recorder in java allows the user to set the sample rate; there's no such thing in the web audio API. The other trick here is that the AudioContext sample rate seemed to be limited to powers of 2, so we can't use 8000; we'll go with 8192.

I have two audio contexts created, and it seems like they're both working. One is running at a lower sample rate (8192) than the spectrum (48000). From reading up on wasm, it seems more likely that I need to write values to the wasmMemory and have the js read from that memory. There doesn't seem to be a simpler method like var words = Module.getStringFromModule(); so I'll have to set up a Uint8_t buffer stored in example.cc, where I'd add getDecodedStringPointer() and getDecodedStringLength() so the wasmMemory.buffer can expose the decoded string.

So something like the above might do it. This is set up like the other spectrum functions.
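The js side of that plan might look like this (a sketch; getDecodedStringPointer/getDecodedStringLength are the planned names from above, and the rest is assumed):

```javascript
// View the decoded payload bytes in wasm memory, decode them to a string,
// and trim the trailing zero padding of the fixed-size buffer.
function readDecodedString(module) {
    const bytes = new Uint8Array(
        module.wasmMemory.buffer,
        module._getDecodedStringPointer(),
        module._getDecodedStringLength()
    );
    return new TextDecoder().decode(bytes).replace(/\0+$/, '');
}
```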

Well, that's a lot of 0s. 256 of them, I'd guess. But that's what it should be, I believe. I tried sending it a squawk from the rattlegram app, but I didn't see any changes in the 0s. Oh, of course, I have to ask the Module to fetch the decoder.

It's not 0s any more, so there is something going on with the fetch function; whether or not it's working is unclear. Again, I sent a squawk from the rattlegram app and the pattern of numbers didn't change, so it's likely that the input into the decoder is missing something. Still, I feel like I'm getting somewhere with all of this. I'm also seeing about a 200ms hitch in the performance when the fetch occurs; everything freezes for a blink. I wonder how the ios version does the audio sampling. Ah yes, looks like the swift version can also set 8000 for the sample rate in the AV tools: let desiredFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false)! The feed function in the decoder throws an error if the sample count exceeds 288, so any chunk at or below that should go through the rest of the function.

So the trick seems to be that I need to ask the Module for a HEAPF32 buffer offset using a malloc call, but there's no Module._malloc function exposed. So the path is: create a floatArray, get a space in the Module's HEAPF32 buffer, copy the floatArray into that buffer using heap.set(floatArray, offset), then pass the offset value from the malloc to the feed function. The feed gets the pointer to the start of the memory allocation and the length, and it'll process each value in the buffer until it gets to the length.
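That path, sketched (this assumes _malloc/_free get exported, e.g. via -sEXPORTED_FUNCTIONS, and a _feedDecoder wrapper; none of these names are confirmed against the actual build):

```javascript
// Reserve scratch space in the wasm heap, copy the samples in at the float
// offset (HEAPF32 is indexed in floats while malloc returns a byte pointer),
// then hand the byte pointer and sample count to the decoder.
function feedFloats(module, samples) {
    const bytes = samples.length * Float32Array.BYTES_PER_ELEMENT;
    const ptr = module._malloc(bytes);
    module.HEAPF32.set(samples, ptr / 4);
    const ready = module._feedDecoder(ptr, samples.length);
    module._free(ptr); // don't leak scratch space on every audio callback
    return ready;
}
```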

So it looks like, this is working!

I am indeed getting a long array of floats!

In the cpp, I have a local float feed[8192] and a feedlength set to 8192, so I think this should solve the copy-into-memory issue I was having. 👍 yay. So copying stuff into and out of memory is not that bad.
Next I just need to pipe all that into the decoder in a way it expects. 🤷 though I'm not entirely sure what it's really expecting, since it's looking at the same feed array and it's not taking kindly to it. But it's getting late, so I'll leave that for tomorrow.

[DONE]

  1. Populated a float32Array in the Module's memory.
  2. Got two audio context nodes running, one for the spectrum, and another for the decoder.

[TODO]

  1. Pipe the feed array into the decoder.
  2. get the payload from the wasmModule.
  3. create a 3rd audio context for playing back encoded text.

2024 02 13

ok lets see what we can get working with the wasmMemory.buffer tonight.

Looks like I got audio streaming into a freshly compiled wasm Module (simply test.wasm). So next comes the decoder and encoder.
I'll start with the decoder since I think it can use the same audio feed that the other elements above use.
In decoder.hh there's int fetch(uint8_t *payload) and bool feed(const float *audio_buffer, int sample_count), so I think feed will take the audio_buffer with the sample count.

So const decoder = Module._createDecoder(); returns undefined (not really surprised), so there's no decoder object coming out of the module that would let me just call decoder.feed(audio_buffer, 48000). I'll need to figure out how to get access to the decoder's feed function; I think I just need to connect the audio buffer to it.

So with the colorful spectrum/scope etc., the audio is fed into the module's memory buffer a byte at a time in a for loop as the audio stream comes in. The Float32Array gets filled, and when the input length is reached the module is told to stft(), which I assume processes and clears the audio sample in the buffer. In a similar way, I think the decoder likely has a space in the buffer that I need a pointer to, and a length to reach, to populate with float32s. Preferably, I'd be able to point the decoder to the part of the buffer that's being fed the audio right now. So perhaps I'll just put the decoder's feed in the stft() function right after the signal visuals are processed.

Reading through how feed works, there's a value called extended_length: accumulated floats are buffered there to be processed. So for feed(floats[], someValue), someValue needs to stay within that extended_length. Reading what the value is, I see static const int extended_length = symbol_length + guard_length; where symbol_length = 256; and guard_length = 32; so we feed the decoder up to 288 data points at a time, and maybe we try to fetch the result after that? 🤷 I'll try that. Or perhaps I should reference either the android or ios project to see how they do it.

Oh, I see: byte[] payload = new byte[256]; int result = fetchDecoder(payload);. I was close; a byte array is handed to fetchDecoder to receive the payload. I'll call it a night for now; I can see where I need to go from here. I'll have to divide the data from the audio into chunks, feed them to the decoder, then figure out how often I need to fetch values from it.
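Whatever chunk size turns out to be right (160 would match the Android recordBuffer), the splitting itself is straightforward; a hedged sketch:

```javascript
// Split a sample buffer into decoder-sized pieces; subarray avoids copying,
// and the last piece may be shorter than `size`.
function chunkSamples(samples, size) {
    const chunks = [];
    for (let i = 0; i < samples.length; i += size) {
        chunks.push(samples.subarray(i, Math.min(i + size, samples.length)));
    }
    return chunks;
}
```

Each chunk would then go to the feed call, fetching whenever feed reports a payload is ready.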

2024 02 12

Work work, and side projects.

EDX

Joined a class to learn a bit more about wasm in a structured way.

๐Ÿธ Ribbit

Figuring out a few more ways to get audio i/o working. In the previous PWA the audio pointer and length were used.

        var input = new Float32Array(buffer, wasm.input_pointer(), wasm.input_length());
    

This was used to draw a bunch of bits from the input.

        const buffer = wasm.memory.buffer;
    

From the buffer in the memory the spectrum and waterfall were drawn.

        var scope_rgba = new Uint8ClampedArray(buffer, wasm.scope_pointer(), 4 * wasm.scope_length());
    

From the looks of it, I need to specify a buffer in the cpp to allow access to memory.

Looks like the buffer array was made of a series of rgba values. For some reason, wasmMemory isn't exported as part of the emcc build that creates the wasm I've been working with. From the documentation, there's some extra work I need to do for the wasm to export a block of memory to access from js.

        emcc example.cc -o test/test.html -sEXPORTED_RUNTIME_METHODS=wasmMemory -sMODULARIZE -sEXPORT_NAME=RIBBIT
    

Ah ha, so -sMODULARIZE made a namespace collision with "Module", which is somewhere in some other code I didn't write. To avoid that, add -sEXPORT_NAME=Ribbit, and then NOW I have access to memory.buffer, but it seems like Module needs to be renamed to Ribbit.

After some tries at figuring this out, I'm still having issues reaching the Module again. It looks like C:\git_repos\emscripten-core\emsdk\upstream\emscripten\src\shell.html needs some modifications?

It's a bit late, so I'll continue this tomorrow, but during the experiments I was able to see a wasmMemory.buffer and it pointed to something in memory, so there's potential somewhere around here. I just need to figure out the steps to re-create what I had done, seemingly by accident.
So what did I figure out tonight?

  1. wasmMemory.buffer needs to be exposed when compiled. This seems to be done by adding -sMODULARIZE
  2. The old wasm had this option when it was exported, so the buffer was available.
  3. The new wasm, when built, needs a similar flag to expose the memory. I think that's -sMODULARIZE, but I'm not 100% sure. (more testing required)
  4. When the modularize flag is added, an error is produced that requires an -sEXPORT_NAME=Ribbit (or something else), but then I have a problem loading the html: it hangs without errors.
  5. Simply renaming "Module" in the html to "Ribbit" didn't fix the hang, so there's something more to it than this. The error: emcc: error: customizing EXPORT_NAME requires that the HTML be customized to use that name (see https://github.com/emscripten-core/emscripten/issues/10086) and the related link are not really helping.
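For what it's worth on point 4: with -sMODULARIZE the output no longer sets up a global Module at all; it defines a factory function named by -sEXPORT_NAME that resolves to the module instance, which would explain why the stock html (which expects a global Module) hangs. A sketch of the consuming side (the factory is passed in here, so nothing is assumed about globals):

```javascript
// With MODULARIZE, the generated .js defines a factory (e.g. a global Ribbit).
// Calling it instantiates the wasm and resolves with the module object.
async function initModule(factory) {
    const module = await factory();
    // wasmMemory is only present with -sEXPORTED_RUNTIME_METHODS=wasmMemory.
    return new Float32Array(module.wasmMemory.buffer);
}
```

So the page would call initModule(Ribbit).then(view => { ... }) instead of relying on a Module global.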

So tomorrow night I'll look into figuring this out again, and I need to learn more about the options required to make the buffer available to the JS.

Oh, I was trying to use Module.wasmMemory, but after talking to copilot, it's just wasmMemory, and buffer is there. 🤷 OK, that's enough for tonight. I'll see if I can use buffer tomorrow.

2024 02 11

Spent most of the day digging holes. Got 6 holes dug, 5 new fence posts, a treated 4x4 and 300lbs of concrete poured to help a leaning fence stand up straight; ๐Ÿงฑ๐Ÿงฑ๐Ÿงฑ๐Ÿงฑ๐Ÿงฑ๐Ÿงฑ. Hope the rain next week doesn't undo all the work ๐Ÿ˜“... But for now, I'll walk downtown with the XYL grab a bite to eat and maybe catch what's left of the superb owl ๐Ÿฆ‰.

๐Ÿธ Ribbit

Tonight I'll start looking at piping audio into the decoder.

๐Ÿช„ Code Witches

Also tonight I might look into making a map menu setup.

โ˜น๏ธ 49rs lost

Soros and Pfizer won their psy-op. Had 1.5 beers, and I'm done for the night. Start up the kerosene heater (PG&E is twice as expensive as kerosene) and turn in for the night. Not going to get around to either project. 😔.

2024 02 10

Parts obtained, two fence posts dug in, concrete filled holes. Tomorrow is four more to do.

Ribbit

Tonight I'll be digging into the different functions for using SDL to get audio into the DSP. So, reading up on SDL's audio systems, as well as rendering to canvas so I can display the waterfall/spectrum out to the canvas.
Because it seems important I'll also see if I can include encoder.hh and decoder.hh and expand example.cc to instance an encoder and decoder.

Oh how exciting: decided to copy over all the cpp from the Android / Java project into a "transcoder" directory. Now I just need to fix up a bunch of includes to point to the right files. Of course, now "assert" isn't actually a thing. Swapping out quirks.hh for algorithm and replacing min/max with std::min/std::max; there's a bunch of little changes in the cpp that need to be done to get the encoder/decoder integrated.

After a couple of hours of watching anime while chipping away at a wall of errors, I can finally build the project with the included encoder.hh and decoder.hh files. 👍 very nice, very nice. I didn't instance either yet, but that'll come next.

So something like this, which in the android project had a bunch of java-specific ornamentation added for creating the hooks into the android app, has been turned into:

Nothing too spectacular, but at the same time, in the browser console I get to call Module._createEncoder() and nothing crashes. There's no evidence that the encoder is doing anything, but somewhere in the depths of the webassembly there's memory taken up by an encoder; at least I think so.

warning: the pointer incremented by 268435456 refers past the last possible element for an array in 32-bit address space containing 128-bit Here's an interesting "warning", as it were (268435456 is 2^28, suspiciously round). The decoder throws this message. 🤷 I'm not exactly sure what to think of it.

But it seems to be doing the thing. So the Decoder and Encoder are living somewhere in wasm memory space, so now onto input and output. 🥳 So far so good.
In any case, I need to change gears to something more visual for the rest of the night. It's been about 4 hours on this, so I need a change of pace.

Code Witches

So, on to some javascript instead. Tonight I think I'll add in a generic map menu system for navigating a "Town" map.

An interesting article on tweening using wasm.

2024 02 09

Fence repair, web assembly, and CW. After work and some pizza and a nap it's about time to figure out how to get a pixel out of the DSP libs.

2024 02 08

messing with some new css. ๐Ÿ› is this an improvement?
This weekend I need to spend ⌚ and 💸 to repair storm-damaged fences that collapsed around the yard.

Ribbit

Making some colorful squares using SDL. The wasm learnings continue this evening with drawing things from wasm into the canvas object living in the html document.

Also on tonight's list: using emscripten_set_main_loop to update the canvas from wasm. Looks like SDL is the preferred system to get audio and video in and out of wasm.

Considering this as the pipeline in/out of the encoder/decoder for ribbit.

Code Witches

In parallel, I also need to write a CW decoder for the browser. I was considering using pure js for this, but from the looks of it, the waterfall can be quite high-res if I do the pixel manipulation in webassembly. Note the colorful noise square is a 512x512 pixel RGBA image updating quite fast in the browser.
Using a regular canvas '2d' context in javascript, I was not able to get very much performance drawing that many pixels. So I think to get a really nice looking waterfall I'll need to do it with wasm.

SDL

SDL offers audio using SDL_Init(SDL_INIT_AUDIO), where I'd be able to pipe the bits from SDL into the decoder. I'll need to figure out how the bits are set up, or rather which Uint8* or Uint32* and whatever static_cast from SDL I'd need to use to pipe the bits to the decoder. In the android/ios projects there's simplex_decoder.hh, where the SimplexDecoder class lives. I haven't read through that enough to understand what's going on, but it's heavily templated, so it'll require some re-reading between objects of the same name to figure it out.

2024 02 07

More rain. โ›ˆ๏ธ I hope the remaining parts of my fence don't fall over.

Ribbit

Tonight I'll start to get more functions exposed to JS, and we'll see if we can get input/output working for text. Or at least we'll see if we can poke/peek into the wasm object.

I should also make a list of everything to build the project.
The short list:

At some point soon I should set up a github repo to host the new pwa on, but that will come once I have a working pipeline into the wasm from the html. Also needed: a larger document to keep my notes specific to wasm.

Goodness, this actually worked.

So in the Module in the browser, the pointer to the image and whatever other functions were exported all seem to be operating as expected, which is nice, so I'll take this as an indication that the cpp is more or less actually doing what it's supposed to be doing.

        var scope_rgba = new Uint8ClampedArray(
            buffer,
            wasm.scope_pointer(),
            4 * wasm.scope_length()
        );

        var scope_image = new ImageData(
            scope_rgba,
            wasm.scope_width(),
            wasm.scope_height()
        );
    

From the pwa, the image for the scope is an 8-bit array from the module, and then it's drawn to the canvas in an animationFrame. Simple enough! I'll have something to bite into this weekend, aside from digging post holes and fixing the fence that fell in the rain storm this past weekend.

Getting so close. The original pwa has in it a reference to const buffer = wasm.memory.buffer; but from what I'm able to tell, I need to assign a new WebAssembly.Memory object, and I'm not sure how the new array gets populated. I'd guess that the pointer references the start of the array in memory and then it gets populated somewhere; I think that's the stft() call?
I'll have to dig into this a bit more, but I think I'm on the right track.
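The mechanics behind that buffer line, as I understand them so far: wasm linear memory is one big ArrayBuffer, and typed-array views are just windows into it at a pointer offset. A minimal sketch (the offset 16 stands in for whatever scope_pointer() would return):

```javascript
// One wasm memory page is 64 KiB; 'memory.buffer' is a plain ArrayBuffer.
const memory = new WebAssembly.Memory({ initial: 1 });

// Pretend the wasm side wrote a pixel byte at offset 16.
const bytes = new Uint8Array(memory.buffer);
bytes[16] = 255;

// A view at (pointer, length) sees the same storage, no copy involved.
const view = new Uint8ClampedArray(memory.buffer, 16, 4);
console.log(view[0]); // 255
```

So nothing "populates the new array" as such: the C++ side writes into linear memory, and the JS view simply observes those bytes through the shared buffer.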

Code Witches

Map View! ๐Ÿ—บ๏ธ or at least we should make a map for the school so we can get the player introduced to using CW to navigate the world a bit.

2024 02 06

Back to it.

Ribbit

Finally figured out some resolutions to the PATH issues: "LLVM_ROOT" now points to "LLVM/bin/clang.exe", so that's good. Now I'm finally able to build a .js file with the wrappers to the exported functions.
What a pain in the 👁.

On another note, I've also managed to get the VSCode cpp stuff figured out as well, at least in an isolated test. Time to move on to the rest of the code base to see how well that behaves in the wasm context.

2024 02 04

The fight continues.

Ribbit

So I guess EXTERN gets defined in the cpp:

        #ifdef __cplusplus
        #define EXTERN extern "C"
        #else
        #define EXTERN
        #endif
    

It's defined as simply extern "C", so I guess that's just a shorthand. I'm not sure it's completely necessary, since `extern "C" EMSCRIPTEN_KEEPALIVE` is the same thing. 🤷 meh, whatever.

Scoop broke my installation of emcc: clang.exe is now pathed to /current/clang.exe instead of /current/bin/clang.exe, which is a bug, and I'm looking for a fix. The missing /bin breaks the build pipeline, so now I can't get anything to work. ☹️

๐ŸŒฌ๏ธ๐Ÿƒalso pretty strong winds blowing here, so it's really likely my power will go out tonight since my area is highly prone to power lines going down when there's a strong breeze. Why do I have to pay PG&E nearly a thousand dollars a month for this? Oh ya, because we give hundreds to them for green energy and we've lost energy independence, and solar panels and windmills don't work when there's anything other than mild weather.

Power out...

Power was out for most of the night, so there goes Sunday. On another note, this allowed for some indoor end-fed halfwave antenna action (wind and rain preventing me from outdoor fun at night). Once the power came back on, the waterfall went white with QRM (man-made interference), but before that I was listening in on SSB phone from Italy, India, Australia, and South Africa. The noise floor was pretty low aside from storm QRN.

2024 02 03

Saturday, and some house chores.

Ribbit

Going to make an isolated test to figure out why EXTERN isn't macroing in from the emscripten.h header. But that's after I install a new door for the bathroom.

The difference between the end of last month and today is that I'm working with a larger set of code from the DSP included into the cpp. Technically, if anything, the spectrum code executing in main should be included.

Tonight I'll be building some more tests and looking at how to get the emscripten canvas object drawing pixels.

So it looks like I had a few different paths to python and cmake. I cleaned that up, but one of the required apps, 'llvm', wasn't installed in the path yet. So I've also installed 'scoop' to install main/llvm.

Configs...

So from a bit of fighting with where cmake is, my bash shell never had a .bashrc so after adding one I was able to add the following:

        export PATH="/c/Program Files/CMake/bin:$PATH"
    

This then allowed the following:

        $ which cmake
        /c/Program Files/CMake/bin/cmake
    

Before this, the path pointed at some random cmake in a python39 scripts directory, which is odd since python311 is running in the bash environment. So why all this? As part of getting emscripten to work, the different components (cmake, llvm, python, and java) all need to be properly installed. With my environment having been battled on for a while, different versions and installations of cmake, llvm, and java have all been installed before. Because of this, the emscripten install, which also makes assumptions about installs, didn't know which versions of llvm and cmake were installed. So things like `emconfigure ./configure` would fail, since emcc was pointing at different versions of python which didn't know which llvm to point to, etc. This is causing many problems.

        $ emconfigure ./configure
        configure: ./configure
        emconfigure: error: './configure' failed: [WinError 2] The system cannot find the file specified
    

So the configure file can't be found because the paths all point at different locations. At this point you'd think it would be easier to uninstall everything and reinstall from scratch, but from past experience there will still be places where bash, python, java, etc. hold onto paths that persist to confuse things even after uninstallation.

I can't really think of anything aside from nuking and paving from Windows up; that would ensure there's nothing left that could interfere with a clean installation of emscripten. Anyway, I'll continue to figure out how to fix up some more emcc paths, or better ways to install the various components so emconfigure and emmake work. Then I still need to figure out why the include for emscripten.h still doesn't see the EXTERN macro but does see EMSCRIPTEN_KEEPALIVE for some reason. I'd imagine they're both defined in the same file, so why recognize one but not the other?

        Installing SDK 'sdk-releases-e5523d57a0e0dcf80f3b101bbc23613fcc3101aa-64bit'..
        Skipped installing node-16.20.0-64bit, already installed.
        Skipped installing python-3.9.2-nuget-64bit, already installed.
        Skipped installing java-8.152-64bit, already installed.
    

When updating emsdk I see it's pulling python 3.9.2, which I'm assuming is in a different path than the 'python' that bash recognizes. Problematic, isn't it. Unfortunately I have a different project using python311, so I can't really just uninstall that. Well, perhaps I'll figure out how to get bash to see the python in the emscripten path using .bashrc. 🤷 hope that works.

Code Witches

I'll also start building a map system for navigating in the world using grid squares. So I'll need to build some sort of town map, for navigating in town, then a world map for navigating from town to town.

For town-to-town navigation, the player will have to travel through hostile areas and encounter monsters. In each town the player will be able to buy new equipment, visit swap meets to upgrade their equipment, and find new spells.

2024 02 02

Managed to get out of jury duty. But for some reason I'm mentally fried today.

Ribbit

Let's figure out how to link other libraries into an Emscripten project. Going by this, we'll look at how external libraries are integrated into a project. Part of the setup for emscripten (and cpp projects in general) is adding a PATH variable for every library. For ribbit I'll need to add in the emscripten libs along with some stuff from the original DSP cpp.

        "includePath": [
        "${workspaceFolder}/**",
        "C:/git_repos/emscripten-core/emscripten/system/include"
    ],
    

In the .vscode directory in the project there's a generated `c_cpp_properties.json` that needs the above lines added to its includePath. Now the emscripten.h include no longer gets red squiggles.

There's a lot of const static declared at file scope in example.cc that seems to break the compiler: emcc complains "you cannot define an array with a variable size at a global scope".

So many rules... really, all the defines could boil down to just an int defined here and there. Honestly, if we define max and min as the larger and smaller of values we already know, then we already know the final result, don't we?
I'll just replace these with the calculated int values. What could go wrong?

Had to rename _start to main(), along with a bunch of other small changes.

        emcc -o test/test.html example.cc -s EXIT_RUNTIME=1
    

To build this I had a test directory where I have:

Opening up the test.html I get:

where main() is executed, and I'm assuming the rest is executed before the printf("hello world"); is run. So somewhere in memory there's a spectrum being drawn, I believe. Now I've got to figure out how to export the other functions so js can see them, make some changes, and request pixels to draw to a canvas.

So that's exciting: I managed to get a wasm out of example.cc through emscripten. Next, a bit of cleanup moving the rest of the DSP:: into a subdirectory, and then figuring out why `EXTERN EMSCRIPTEN_KEEPALIVE` doesn't seem to macro in from the emscripten.h file.

        emcc -o test/test.html example.cc -s EXIT_RUNTIME=1 -s EXPORTED_FUNCTIONS='["_main","_myFunction"]' 
    

All this is still building, but for some reason the expected _myFunction isn't appearing in the test.js file after _main.

2024 02 01

Happy February,
The fun doesn't end now. In any case, I've been continuing the dive into emscripten, and an interesting dive it's been. Going through the tutorials and understanding the pipeline has been an interesting experience. Also I have jury duty at 1pm 😓.

Ribbit

For a bit now I've subsisted off of nothing more than the emsdk, but now I'm getting into the SDL part of the tool set. Also quite interesting. And drawing colorful squares won't be where I stop tonight. So far there's mostly been R&D into wasm modules and getting familiar with all of that. Most of my projects before this were for large web apps that used various libs like three.js, matter.js, or 8thwall.
Most of the time the interfaces were already written for me, so this is a new experience writing my own interface, and I'd like to understand how things work under the hood.
8thwall in particular operates with quite a blackbox of code where an async load fetches a wasm file. So I've learned to appreciate what's going on behind the curtain when using their sdk/api.

        emcc -o hello.html hello.cpp -s NO_EXIT_RUNTIME=1 -s "EXPORTED_RUNTIME_METHODS=['myFunction']"

So in the most basic terms:

Functions in the cpp get exported to the js file once they've been added to the exported runtime methods.

This also assumes the function was prefixed with

    EXTERN EMSCRIPTEN_KEEPALIVE

Porting from the previous version of the ribbit pwa cpp will require a bunch of small modifications to the code to have it play nicely with emcc, and at this point I'd like a generator for the command line to avoid typos.
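Such a command-line generator doesn't exist in the project yet, but a sketch of the idea could be as small as this (the option object shape and function name are made up here):

```javascript
// Build the emcc invocation from a list of exported function names,
// so the "_"-prefixing and quoting can't be fat-fingered.
function emccCommand({ input, output, exports }) {
  const list = exports.map((f) => `"_${f}"`).join(',');
  return [
    'emcc', '-o', output, input,
    '-s', 'EXIT_RUNTIME=1',
    '-s', `EXPORTED_FUNCTIONS='[${list}]'`,
  ].join(' ');
}

console.log(emccCommand({
  input: 'example.cc',
  output: 'test/test.html',
  exports: ['main', 'myFunction'],
}));
// emcc -o test/test.html example.cc -s EXIT_RUNTIME=1 -s EXPORTED_FUNCTIONS='["_main","_myFunction"]'
```

Growing the `exports` array is then the only edit needed when a new function gets the EMSCRIPTEN_KEEPALIVE treatment.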

Code Witches

Also, with this project I've been watching some streams on YouTube about how people learn and use CW on the air. So as a matter of utility, I'll try my best to work into the game the common abbreviations one would encounter when activating a park or summit.

GN

Anyway that's enough for tonight, it's after 2am.

2024 01 29

Happy Monday.

Ribbit

Tonight I'm spending most of my time reading into the depths of how wasm works. I really don't like not understanding exactly what's going on when I deal with a library, especially if it's something I'm working on porting. So to that end I'm figuring out a minimal pipeline for getting a wasm binary into a web page.

To some degree I'm starting to understand the overly complex hello.js that's generated by emcc. It's also an example of how to make something overly complex as a "simple" template; that said, I do understand why it's complex.

Automatically finding all of the exported functions and wrapping them into a js function call isn't trivial. From this I saw:
... You could then build your custom HTML completely from scratch, although this is an advanced approach; it is usually easier to use the provided HTML template.
Possibly; unfortunately, I'm not one who would simply use a template. 🤷 The easy route may be faster, but I don't think I'll understand how it works.

Reading the Mozilla docs, I should be using the instantiateStreaming function, which the emcc template js file doesn't use. So I'll do the hard thing and figure out how to use it instead.
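The shape of that flow, going by the MDN docs, is roughly the following. This is a sketch, not the project's loader; `module.wasm` and the empty import object are placeholders, and the fallback covers servers that don't send the `application/wasm` MIME type (a case MDN calls out):

```javascript
// Compile-while-downloading when possible, fall back to buffering the
// whole binary otherwise.
async function loadWasm(url, imports = {}) {
  if (WebAssembly.instantiateStreaming) {
    try {
      return await WebAssembly.instantiateStreaming(fetch(url), imports);
    } catch (_) {
      // Streaming needs a proper Response with the right Content-Type;
      // fall through to the ArrayBuffer path if that fails.
    }
  }
  const bytes = await (await fetch(url)).arrayBuffer();
  return WebAssembly.instantiate(bytes, imports);
}
```

Either branch resolves to a `{ module, instance }` pair, so the calling code doesn't care which path was taken.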

To understand what's actually going on with the wasm I also found wat2wasm, part of the WebAssembly binary toolkit (WABT), which includes a bunch of tools for deconstructing wasm binaries for debugging.

Another important reference for understanding all of the wasm stuff. If nothing else; tonight has been good for finding a bunch of references and tools. Also everything tonight has referenced rust before c++ so that's somewhat telling, isn't it.

Before the end of the night I've managed to find a bunch of new references and tools, so that's good. I've also managed to get my simple hello.cpp to flood my browser's console with a ton of lines of numbers spewed out by the fd_write function, instead of just printing "hello world" like I expected it to.

2024 01 28

Got the end of the battle to show the status view where we'll increment XP and show things like pickups. I'll need to add the 'e' to continue to the view.

After this the player will need to return to either a dialog or a map view. so I'll need to build the navigation/map view next to indicate to the player if they're in a town or traveling in the wild.

Some transitions should also be added so the change from dialog/battle/status isn't so abrupt. Sound effects would also be pretty nice.
[TODO] Next up is a map view system. So the scene will need to produce some dialogs for navigating between scenes!

2024 01 27

Plans for the weekend. Along with wandering about town...

Code Witches

Looking to get the "Battle End" sequence set up once all monsters have been defeated. I should also integrate some sort of animation system into the code base for CW, and add a UI sequence to the game design doc so I know what screens are left to build. Also referencing this video and some more references for CW abbreviations to use in the game.
A sequence of keyframes[] would help.

        keyframe = {
            time: 0,
            func: () => {}
        }
    

So I'd have some sort of deltaTime and a timeStart/End to set when each keyframe executes its function. This would be for things like showing and changing images, then changing stages and characters in the scene. I should look at the possibility of storing executable functions in the indexedDB of the browser.
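A minimal version of that sequencer could look like this. It's a sketch of the idea, not the Code Witches code; `makeSequence` and the deltaTime-driven `update` are names I'm assuming here:

```javascript
// Each keyframe fires once when the accumulated elapsed time passes its
// timestamp, in timestamp order.
function makeSequence(keyframes) {
  let elapsed = 0;
  const pending = [...keyframes].sort((a, b) => a.time - b.time);
  return function update(deltaTime) {
    elapsed += deltaTime;
    while (pending.length && pending[0].time <= elapsed) {
      pending.shift().func(); // fire and retire the keyframe
    }
  };
}

const fired = [];
const update = makeSequence([
  { time: 0,   func: () => fired.push('show image') },
  { time: 500, func: () => fired.push('change stage') },
]);
update(16);  // fires the t=0 keyframe
update(600); // elapsed is now past 500, fires the second keyframe
```

Driving this from a single deltaTime accumulator, rather than scattered setTimeout calls, also means a paused or backgrounded tab just delays the keyframes instead of breaking their ordering.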

Also amusing that one of the anime shows I've been watching happens to include this as a central plot point in today's episode.

Ribbit

Also this weekend I'll look at merging more of the encoder/decoder into the wasm.

DiceBoxx

I'll also look at setting up the Krea AI app to start making some more art for a friend's dice app.

2024 01 26

        emcc hello.cpp -s WASM=1 -s EXPORTED_FUNCTIONS=_square,_main -o hello.html
    

The calls in my isolated example are working now.

So this was from example 3, and I'm just checking whether the function does a thing or not.

The result just sends an int to the html from the wasm as the for loop iterates through the spectrogram, so that's good.

    extern "C"
    {

        int square(int x)
        {
            return x * x;
        }
    }
    

I also included the code above as one of the "EXPORTED_FUNCTIONS" in the wasm as a simple test: in the web browser's console I can type 'Module._square(3)' and get '9' back. So that's good. Moving right along. I'm pretty sure the rest of the DSP library will be something I can continue to chew on to get the rest of the Ribbit decoder/encoder working in the wasm file.
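For the curious, what Module._square is doing underneath can be shown with the same function hand-assembled as raw wasm bytes and called straight from JS. These bytes are the standard hand-written `(module (func (export "square") ...))` example, not anything emcc emitted:

```javascript
// A complete wasm module: type, function, export, and code sections.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00, // \0asm magic + version
  0x01, 0x06, 0x01, 0x60, 0x01, 0x7f, 0x01, 0x7f, // type 0: (i32) -> i32
  0x03, 0x02, 0x01, 0x00,                         // func 0 uses type 0
  0x07, 0x0a, 0x01, 0x06, 0x73, 0x71, 0x75, 0x61, // export "squa...
  0x72, 0x65, 0x00, 0x00,                         // ...re" = func 0
  0x0a, 0x09, 0x01, 0x07, 0x00, 0x20, 0x00, 0x20, // body: local.get 0,
  0x00, 0x6c, 0x0b,                               // local.get 0, i32.mul, end
]);
const { exports } = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(exports.square(3)); // 9
```

The emcc-built module is the same thing at a much larger scale: a pile of exported functions plus the memory and runtime glue that the generated .js file wires up for you.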

2024 01 25

Ok, so I managed to get most of the cpp included in an emscripten wasm for my isolated test.

2024 01 24

I would like to get a wasm of some kind to build. The cpp that was included for the pwa demo was just a decoding waterfall and waveform without any of the cool bits that make up ribbit.

2024 01 23

Getting into some more rattlegram stuff, looking at the PWA project tonight.
Looking at some of the resources on aicodix, where we have links to the modem, the error-correction system, and the audio digital signal processing.

After I get an app working I'll start porting the above libs to Rust, mainly because of the direct wasm->js compatibility. There's really nothing between Rust and JS once you've got a working project; it's magic.

On a more obscure link there's a wasm example that shows a waterfall and waveform. There's this repo for the PWA example, so I'll begin with that, though there don't seem to be any notes on building the wasm from the cpp.
Looks like there's a .cc and .hh in there, so I'll try to use something basic like emscripten to make the wasm binary. There was some suggestion of some other cpp-to-wasm tool, but it looked dubious to me for some reason.
I've just had a lot of awkward run-ins with companies that look/feel like that; usually their tools are half baked, and after a while they get bought for billions by google/meta/apple/microsoft, then disappear and end support.

Emscripten

Looks like the sdk prefers to be run from PowerShell. I saw the 'emsdk_env.ps1', so that would seem the more native Windows setup to use. After that, 'emcc -v' in PowerShell returned a version and other info; that's good.
So now let's see if we can get a thing to build.

        $ emcc example.cc -s WASM=1 -o example.wasm
        example.cc:8:10: fatal error: 'complex.hh' file not found
            8 | #include "complex.hh"

Doh, so that didn't work. Looks like the rest of the source needs to live with the example.cc that was included in Rich's git.
On a good note though 👍 this does mean it's actually going to try to compile the thing into wasm, so that's a step in the right direction. I might just go to the source from aicodix and build from that repo.
Also `wasm-ld: error: unknown file type: Makefile`, so I guess emcc doesn't know what to make of the Makefile. 😔 hmm...

More to come

I'll continue on this when I have a bit more energy, was at the gym for a while and I've yet to shower.

2024 01 22

Trying to figure out the entire gradle pipeline for building the Rattlegram Android project. I'm not much of a gradle/android pro, so I'm rather poking in the dark with this. The plan is to have something interesting to show. I'd ideally have a local pipeline where I can build the PWA locally and test between a few devices.
To that end I'm hoping to figure out how to get the current PWA working so I can start building the rest of the app around the wasm module.
Also, to better understand the cpp module that's built into the wasm, I was hoping to get around to re-writing it in Rust, since that's where the future of wasm/web apps lies (imo).

Also doing some more minor artwork on CodeWitches.

2024 01 20

Doing some Code Witches Art tonight.

2024 01 19

Rattlegram & Ribbit

Looks like I'm getting myself involved with this project. At one point I was about ready to write my own ham chat app, but it looks like I'll just help finish this one.

2024 01 17

Getting into some side-side projects?

Because why not.

So a few other things are stacking up in the queue. Some simple art-project-related stuff, and then I'll be looking into contributing to some interesting open source projects (related to amateur radio). ORI has a few interesting projects going on; one of them is called Ribbit, a project that pairs an HT on uhf/vhf with a cell phone. I thought the project was rather dormant, but I guess I'll be finding out more soon.

Dice

On the side-side note, I've been helping contribute some logo designs and UI design for a simple dice app. Though I'd like to be more prescriptive, it's not my personal project, so I'll avoid getting too personally involved.

Code Witches

Back to the main topic at hand! I'll be looking into doing a bunch of pixel art, and I think I came up with a nice way to make a mono-space font that features morse code. I'm thinking of a stack of pancakes under each character and figuring out how to fit all the dits and dahs into a sort of square block under each letter. So ya, I'm making a CodeWitches.ttf font because there needs to be one.

Rocket

On a coming trip to Japan I'm hoping to visit Rocket, a ham radio store in Tokyo. Should be interesting; this seems to be one of the last holdouts of radio city Akihabara that hasn't been washed away by modern Japan.

2024 01 16

Got the character beeping at you now.

The reply in this case is "t" so that makes the dialog repeat. I'm going to add in the text in the "waterfall" from the dialog text. Also at some point I'll need to do some art that will begin to approach final look and feel sort of stuff.

2024 01 14

Cool, I finally figured out a way to get the dialog menu working. The menu is starting to look a bit like a dynamic programming paradigm, which is interesting and also manages to keep things super flexible.

        {
            label: "w1cq .-- .----",
            text: "GM good morning,",
            keys: { e: 'continue', t: 'repeat' },
            funcs: {
                e: () => { this.setLine(this.lineindex+1); },
                t: () => { this.setLine(this.lineindex); },
            }
        },
    

The example basically works like this: the "keys" are the letters the class is listening for in the waterfall, and the funcs that match those keys are executed.

This shows the dialog responding to 'e' and then executing a scene change on the last e. One option is to merge the funcs into the keys object...

        {
            keys: {
                'e': { label: 'continue', func: ()=>{ /*code*/ } },
                't': { label: 'repeat', func: ()=>{ /*code*/ } }
            }
        }
    

Which does look nicer... so ya, I should do that before I actually write anything necessary. Ok, so that worked out well. I have some other work that needs to be done tonight as well, so I should take a shower and switch modes.
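The dispatch against that merged shape is then a one-liner lookup. A sketch, with the labels and `onLetter` name assumed rather than taken from the actual code:

```javascript
// One dialog line in the merged shape: each decoded letter maps to a
// label (for the on-screen option) and the func to run when it's heard.
const line = {
  keys: {
    e: { label: 'continue', func: () => 'next line' },
    t: { label: 'repeat',   func: () => 'same line' },
  },
};

// Run the handler for a decoded CW letter; letters with no binding
// (noise, off-script input) are simply ignored.
function onLetter(line, letter) {
  const entry = line.keys[letter];
  return entry ? entry.func() : null;
}

console.log(onLetter(line, 'e')); // 'next line'
console.log(onLetter(line, 'x')); // null
```

The labels staying next to their funcs is the nice part: rendering the option list and dispatching input both walk the same object, so they can't drift apart.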

Also added in the teletype class to the dialog. So now the dialog is beeped out when the dialog text is updated.

I have to make the letters from the automated CW appear in the waterfall, as well as hide the options until the teletyper is done sending CW. I should also do something with the text as it's keyed out.

2024 01 12

Time for a three-day weekend, and a lot of game dev, I hope, so long as I'm able to get through this dumb cold. I'm hoping to link all the different parts together. If I can get a way to transition from battle mode to the stats screen to a dialog scene, that'd be great. But right now I can't think of a simple way to transition logically between scene types.

Thinking about the logic: a lot of the "timing" is being done with setInterval and setTimeout calls, which is prone to having interruptions really break the logic. So we need the script manager to tell the director which scenes and actors to populate the scene with. Also, I'll look into adding iambic modes like here, which also shows off ultimatic mode, which is new to me.

2024 01 11

Slight headache tonight, but I'll look into some sort of status update screen after a battle. Also, I should clean up some of the CSS in use on the page before too long. I guess this is a different use of the upper split.

We'll use something like this to display XP gains and points until the next level. To that point though, what stats will be in the box? 'XP' experience points, 'HP' hit points, 'MP' magic points. A callsign should be included. Then we'll need a menu to navigate through equipment, using items, selecting new spells, and powering up spells.
Also, I should add "fill bars" to the usual stats.

I should also add the numbers bubbling up.

Ok, so I got a number popping up out of the target. It's nothing too dramatic, but it's a start. Considering this is all just CSS and Javascript I'd say there's quite a lot going on here.

2024 01 10

So what shall we get to today? Mode switching: dialog should lead to a "training" battle, so some static archery targets in a courtyard. We can still show a dialog actor over the battle actors with the dialog box. I don't think the layouts overlap; it would be nice to have this work together.

The area with the battle menu can be swapped with the dialog menu, this could lead to picking what spells to start off with.

2024 01 09

Managed to get a rough template setup for the dialog view implemented tonight. The battle system is on the right where we have the character delivering dialog on the left. Ignore that all three witches look the same, they're all just placeholders for now.

Of course as the character talks, the dits&dahs will be flowing down the waterfall. Replies and decisions will all be in CW (of course). None of the dialog has been written yet so there's still a lot of work to do on the story side of things.

So far everything is pretty much raw javascript CSS and a single 'canvas' object for the waterfall. All the rest is pretty much as basic as possible. I'm also hoping to open source all of this later on for others to make mods.

TODO

  1. Need a system to switch from dialog to battle mode.
  2. Battle mode needs to end when last enemy is defeated.

Once the last enemy is gone, we'll want to have some celebration, then the mode will switch to either dialog or a map view. But before we can go to a navigation view, we'll need to make one. The Scene will need a "map" scene, and I guess the actor in the scene would be a marker of some kind that indicates where the player is; then the map would need a grid square overlaid.

2024 01 08

After work I'll continue on the notes for the GDD, and I'll start some work on the dialog scene setup, since that's where the "tutorial" of the game will take place and the story mode will be revealed.

Then there's a bunch of work left for things like the navigation mode where the player will be presented with a map of some kind and a "grid" system to key in for moving from place to place. Then during the movement phase various encounters will take place between positions on the map where "Battle" mode will take over.

Eventually the game will be converted into a PWA, a single-page progressive web app; an example of what that can do is here, so that's a long-term goal. The cool thing about that is I'll eventually put the app on Steam or something like that, where the assets and resources for the game will be pre-loaded into a single package and the app will run stand-alone without any connection to the internet. This might require some additional work according to this.

Figuring out how far I can get with the character dialog stuff tonight.

Other stuff today

KE9V talks about the decline of active hams in the USA. This is sort of one of the motivations behind Code Witches. Perhaps a modern take on CW might spark some interest for new users.

I'm also looking into some tech for building a generalized distributed forum system for radio. The nature of something like 4chan as "4ham" would be interesting: leave and read comments on a frequency. Something I think would be an interesting challenge. You'd leave a radio listening on a frequency, sending a beacon on a set time interval. Listeners could find the beacon and, on an off-tempo time beat, ask for posts with a mode like Olivia.
The reply would then send text formatted with a post id, content, and @callsign. A client could then reply with the post id of the content they're responding to, growing a thread.
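A possible wire format for those posts, just to make the idea concrete. Everything here is hypothetical: the field names, the '|' delimiter, and the callsign are all made up for illustration:

```javascript
// Each post travels as one text line: id | parent-id | callsign | content.
// An empty parent field marks a thread root.
function encodePost({ id, replyTo, callsign, content }) {
  return [id, replyTo ?? '', callsign, content].join('|');
}

function decodePost(line) {
  const [id, replyTo, callsign, ...rest] = line.split('|');
  // rest.join('|') lets the content itself contain the delimiter.
  return { id, replyTo: replyTo || null, callsign, content: rest.join('|') };
}

const wire = encodePost({ id: '42', replyTo: '7', callsign: 'W1CQ', content: 'GM es 73' });
console.log(wire); // 42|7|W1CQ|GM es 73
```

Threads fall out of the replyTo chain: a client collects lines, groups them by parent id, and renders the tree, with no central server holding the state.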

2024 01 07

Just before Christmas I started a silly game idea to see if I could build a game around CW. I asked the bing AI if it could generate some 8 bit style game title screens using the letters CW in the title. It came up with some pretty hilarious and quite usable images for things like Charming Wave and Code Witches:

It's fairly amusing to me that I've managed to get as far as I have without any form of design document. So far everything has just been off-the-cuff as it were and I've just been writing code that made sense to get some basic stuff working. So to make things a bit more formal I'll also start on a design doc. I suppose having been involved with game dev since the 90s has given me an intuition to get this far without any preface.

On another note...

Picked up some cables to connect my radio to a new antenna. Also decided to start a simple blog documenting some of the work I'm doing on my ham projects. Progress so far on CW: a basic menu system laid out for the battle system, and general setup of some graphics and directories. Next I'll be going into some of the systems that will be used for dialog and tutorials.

Code Witches Concept

Short Term Goals

Long Term Goals