senders.io - Blog
senders.io's blog feed - https://www.senders.io/ - CC BY-SA 4.0

Music: A Tour de Chorus https://www.senders.io/blog/music/2023-03-18/ https://www.senders.io/blog/music/2023-03-18/index.html Sat, 18 Mar 2023 23:12:23 -0400

A Tour de Chorus

I've been talking a lot about chorus on my mastodon, like, A LOT. So I thought it would be fun to explore my chorus pedals a bit and present this information in some shareable way, since no one wants to listen to 18 minutes of audio in a row.

What's on display

So I have three chorus pedals to show off today:

  • Boss CE-20
  • Walrus Audio Julia
  • Multivox CB-1 Chorus Box

The other gear

I will be playing each of these pedals through my THR-100HD (see my previous music blog post in which I deep dive a bit into this amp). It's running on the crunch channel, just at the edge of breakup, with little to no reverb.

As for guitar, I am using my Reverend Descent RA Baritone.

The demos

For each of these demos I will be playing the same loop (mostly). I recorded a loop into my Boss RC-3 to remove any playing bias towards the more warbly chorus tones, and to make it easier for me! Each demo is about 48s long (depending on how good my trimming was). I added a bit of EQ in post to cut out some digital hum introduced when pairing my CE-20 with my RC-3, so sorry about that...

Let's start with the clean tone:

This loop is something I had been noodling on all week, while on my chorus kick. I feel it's actually a decent demo because it calls on a lot of classic chorus sounds. Individually picked notes, bright open strings, and then at the end some Nirvana-like dark power chord picking. All classic chorus sounds to me.

CE-20

Next we can go through the CE-20. The CE-20 has 6 modes in total, but we'll only be demoing 4: "Rich", "Standard", "Dimensional D", and "CE-1". I skipped the "Acoustic" and "Bass" settings as they've always felt like the "Standard" mode with some slight EQ.

Standard

We can start with "Standard" as it's the most "boss chorus". Though I personally feel it lacks a bit of the bite the CE-1 and CE-2 offer. But it wouldn't surprise me if "Standard" was just a CE-2.

CE-20 Standard Mode - Rate 10 o'clock, Depth 2 o'clock
CE-20 Standard Mode - Rate 2 o'clock, Depth 10 o'clock

Rich

Let's compare this with the "Rich" mode. Keep in mind the only settings I will be changing between these CE-20 modes are the rate and depth. There is actually quite a bit of tone control you get on the CE-20, but I generally keep those knobs fairly static based on my guitar and amp settings, and for the purposes of these demos they stay static.

CE-20 Rich Mode - Rate 10 o'clock, Depth 2 o'clock

Dimensional D

This mode is a recreation of the SDD-320 Dimension D effects unit, later made into the Boss DC-2. This effect is one of my favorite choruses. It's so unique. On the CE-20 there are 7 modes: 1 - 4, as well as 3 "combo" modes: 1+4, 2+4, and 3+4. These map directly to their SDD-320 counterparts, which also let you stack the modes together. This really shines in stereo, but since the Julia is mono, I felt it was only fair to use these the way I use them on my board.

CE-20 Dimensional D Mode - Mode 3
CE-20 Dimensional D Mode - Mode 4
CE-20 Dimensional D Mode - Mode 3+4

These are always so cool to hear. When you get into the combo modes you start getting more "chorus" and less just "width/movement". But these are interesting to listen to compared to the clean. There are subtle differences - but they're there! It's almost like the tone is now less stark and smoother. Like the notes are lathered in butter, mmm!

CE-1

Okay, now on to the real show, the CE-1. Not much to say about this one. It's a CE-1, you have an "intensity" knob, and it's so rich. The delay rate is much slower than you would expect, almost logarithmic. But when you get past noon it starts to get quite seasick.
Editor's note: 7 o'clock may be a bit higher than 7. None of these pedals have freaking numbers on their knobs, so it's all a guess. But it's a bit up from off.

CE-20 CE-1 Mode - Intensity 7 o'clock
CE-20 CE-1 Mode - Intensity 10 o'clock

Multivox CB-1 Chorus Box

The Multivox CB-1 Chorus Box is a CE-1 clone; according to The Gear Page, it's literally just the same circuit and components. I got mine 5 years ago because, well, I love chorus. It currently lives in my rack unit to be used with my synths, and this chorus is just so smooth. But the biggest trouble is dialing in the right level. You'll notice the CB-1 demos use a different demo recording - I had to move my setup and I accidentally wiped the RC-3. And because the CB-1 can be a bit tricky to dial in, it's a bit quieter than the other demo tracks. But the level control is one of my favorites, as it can add some crunch to the tone on the peaks, adding a lot of flavor. I am running my guitar through the "hi" input, because it gives me a bit more play with the input level.

Chorus

CB-1 Chorus Mode - Intensity 7 o'clock
CB-1 Chorus Mode - Intensity 10 o'clock
CB-1 Chorus Mode - Intensity 2 o'clock

Vibrato

Since this is mono, it's acting like a straight vibrato. When playing in stereo this creates its own chorus, as the stereo outs are "dry" and "wet". This differs from the CE-1 chorus too, so it's like 2 chorus pedals in one. These demos are in mono.

CB-1 Vibrato Mode - Rate 10 o'clock, Depth 2 o'clock
CB-1 Vibrato Mode - Rate 2 o'clock, Depth 10 o'clock

CB-1 Off with Level Boost

The CB-1, when overdriven (just by the guitar itself), adds a really warm crunch to the signal, and it's a lot of fun. I usually run my Model D through this and I love it.

CB-1 Off - Level to a point where when I dig in it clips heavily

Walrus Audio Julia

I picked up the Julia because it's such a versatile chorus: giving you control over the rate, depth, lag, waveform, and mix. This lets you craft basically ANY chorus sound you want. Exploring sounds, I've noticed the major limiter being the rate. The Julia is just SO fast. Even at min rate, it's still faster than like 1/3 of the Boss rates. But the sounds are still amazing!

Julia - Triangle Wave, Rate 7 o'clock, Depth 2 o'clock, Lag 3 o'clock, Mix 12 o'clock (chorus)
Julia - Sine Wave, Rate 10 o'clock, Depth 2 o'clock, Lag 9 o'clock, Mix 12 o'clock (chorus)
Julia - Sine Wave, Rate 9 o'clock, Depth 3 o'clock, Lag 9 o'clock, Mix 12 o'clock (chorus)
Julia - Triangle Wave, Rate 8 o'clock, Depth 12 o'clock, Lag 12 o'clock, Mix 5 o'clock (vibrato, max)

Thoughts

Realistically? I love every single one of these choruses. It's such an amazing effect, and I was messing around with the dirty channel too, which still sounded great! The CB-1 was by no means a steal, but it's my favorite chorus tone. It comes with some quirks, though, being a late-70s / early-80s device. The CE-20 is amazing but very much a "mid 00s digital pedal", giving some of that digital-ness to it, especially when mixing with other digital pedals. I'm sure you heard the high-pitched whine in the background. I EQ'd it out, but it's there, and it bothers me. I think getting a CE-2w would give me a lot of the options I want from this, without those digital artifacts. The CE-20 would be perfect if it had a vibrato mode, given the CE-1 has one, and that would really make it the perfect all-in-one. That said, I've had this pedal for at least 12 years (probably closer to 13 - I could dig out the box and see if I kept the receipt). I got it in college as my first ever chorus. I was enamoured with it. It'd be on my board today if it wasn't so big. The Julia is the perfect multi-tool chorus, and I've been really happy with it. But it lacks that really SLOW rate that the Boss pedals have, making it a BIT harder to really dial in the CE-1 tones.

I joked on mastodon that I did this to convince myself I don't NEED a CE-2w or DC-2w...and now I want them even more!

Music: Reworking my THR100HD https://www.senders.io/blog/music/2023-01-06/ https://www.senders.io/blog/music/2023-01-06/index.html Fri, 06 Jan 2023 00:00:00 -0500

Music Blog?!

I wanted to make a little blog section to just talk about my music making. Mainly, to save my friends from enduring my thinking out loud.

Reworking my THR100HD

I have a Yamaha THR100H Dual which is a nice modeling amp with two "amps". Typically, I run these in parallel so I am running through BOTH at the same time. As of late I am actually considering moving to dialing in separate tones, and using my Joyo PXL-Live to act as a "channel" switcher.

Dual Amping

Honestly, dual amping is my favorite thing, and I would hate to give it up, as it gives my tones SO much depth. But I find when I try to mix my guitars that extra depth just makes mixing a bit more of a hassle than it needs to be. Still, I feel Mick of "That Pedal Show" on YouTube feels similarly, considering in one of their "use less" challenge videos he used two amps for maximum tone shaping - which I feel adds some justification to my efforts!

Results after one night

I spent an hour or so tonight messing around with my setup and came out with the following high gain tone:

"Rezzed" - Hi-gain dual amped Baritone guitar
No copyright

Thoughts

I feel it's a bit... boomy still. There is some extra weight coming from the "clean" channel that I think is causing this to lose some clarity. If I wanted to build a mix around this, I don't think I'd even end up keeping that channel - or I would do some heavy EQing to it. Here is what I have dialed in so far:

A photo of the front face knobs of my Yamaha THR100HD. The top amp is set to the clean setting, the booster is turned off. The gain is roughly at 3 o'clock, Master at 9 o'clock, Bass at 10 o'clock, Middle at 2 o'clock, Presence off, Reverb off, and Volume at 11 o'clock. The bottom amp is set to Modern, with the booster turned off. The gain is set to around 2:30, Master at 10 o'clock, Bass at a bit below 9 o'clock, Middle at 2 o'clock, Treble at 1 o'clock, Presence at 1:30, Reverb off, and Volume a little above 9 o'clock.
Current dual amp settings

Future

In the future I plan to set up different profiles for each of the 5 channels per amp - so they're all usable and I can just do single amping - as that gives me the FX loop until I set up a proper stereo board. But until then, this is the setup I've been using, and I rarely touch the back!

How I Generate My RSS Feed https://www.senders.io/blog/2023-01-06/ https://www.senders.io/blog/2023-01-06/index.html Fri, 06 Jan 2023 00:00:00 -0500

How I Generate My RSS Feed

I only just now started supplying an RSS feed to you fine people! You can subscribe to it at www.senders.io/blog/feed.rss!

I decided rather than manually generating the file contents I’d hook into my pre-existing publish scripts to be able to generate the RSS file.

Publishing blog posts - shell scripts ftw

In My Markdown -> HTML Setup I touch on how I publish my markdown files into HTML for this blog. But what I don’t really touch on is the shell scripts that tie the whole process together.

What I have is two, now three, scripts that feed the whole process:

  1. publish-blog.sh - the main script
  2. compile-md.sh - generates the HTML output
  3. update-feed.sh - generates/appends the RSS feed

The update-feed.sh script is the new one I just added.

publish-blog.sh is the primary interface: I supply the date of the post and the path to the md file, and it calls compile and update to automate the entire process.
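
To make that flow concrete, here is a rough sketch of what the orchestration could look like. This is not the actual script - the argument handling and compile-md.sh's exact interface here are assumptions - but update-feed.sh's arguments match how it's described below.

#!/bin/bash
# Sketch of publish-blog.sh - illustrative only, not the real script.
# Usage: ./publish-blog.sh 2023-01-06 path/to/post.md
POST_DATE="$1"
MD_FILE="$2"

# Assumed layout: the generated HTML sits next to the markdown as index.html
HTML_FILE="$(dirname "$MD_FILE")/index.html"

# Assumed: the title is the first markdown heading in the file
TITLE="$(head -n 1 "$MD_FILE" | sed -e 's/^#* *//')"

./compile-md.sh "$MD_FILE" "$HTML_FILE"              # generate the HTML output
./update-feed.sh "$POST_DATE" "$TITLE" "$HTML_FILE"  # prepend the item to the RSS feed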

Without going into TOO much detail you can view the latest versions of the scripts at git.senders.io/senders/senders-io/tree/.

But the gist of the scripts is that I parse out the necessary details, find/replace some tokens in template files I have set up for headers and footers, and concat the outputs into the final output HTML files, and now the RSS feed.

update-feed.sh

Source File: git.senders.io/senders/senders-io/tree/update-feed.sh

This script is pretty interesting. I didn't want to deal with any XML parsers and libraries just to maintain a proper XML RSS file and push items into the tree. Rather, I just follow a similar setup to my markdown generation: I leverage some temporary files to hold the contents, a static temp file for the previously generated content, and at the end swap the temp file with the real file.

I take in an input of the publish date (this is the date from the publish script), the title, and the HTML file path. These are all already variables in the publish script, but also something I can manually supply if I need to publish an older article, or something I wrote directly in HTML.

The core of the script is found here:

PUBDATE=$(date -d "$1" -R)
TITLE=$2
FILE_PATH=$3
PERMALINK=$(echo "${FILE_PATH}" | sed -e "s,${TKN_URL_STRIP},${URL_PREFIX},g")
LINK=$(echo "${PERMALINK}" | sed -e "s,${TKN_INDEX_STRIP},,g")

# Generate TMP FEED File Header

cat -s $FILE_RSS_HEADER > $FILE_TMP_FEED
sed -i -E "s/${TKN_BUILDDATE}/${BUILDDATE}/g" $FILE_TMP_FEED
sed -i -E "s/${TKN_PUBDATE}/${PUBDATE}/g" $FILE_TMP_FEED

# Generate TMP Item File

cat -s $FILE_RSS_ITEM_HEADER > $FILE_TMP_ITEM
sed -i -E "s~${TKN_TITLE}~${TITLE}~g" $FILE_TMP_ITEM
sed -i -E "s/${TKN_PUBDATE}/${PUBDATE}/g" $FILE_TMP_ITEM
sed -i -E "s,${TKN_PERMALINK},${PERMALINK},g" $FILE_TMP_ITEM
sed -i -E "s,${TKN_LINK},${LINK},g" $FILE_TMP_ITEM
sed -n "/<article>/,/<\/article>/p" $FILE_PATH >> $FILE_TMP_ITEM
cat -s $FILE_RSS_ITEM_FOOTER >> $FILE_TMP_ITEM

# Prepend Item to items list and overwrite items file w/ prepended item
## In order to "prepend" the item (so it's on top of the others)
## We need to concat the tmp item file with the existing list, then
## we can push the contents over the existing file
## We use cat -s to squeeze the blank lines
cat -s $FILE_ITEM_OUTPUT >> $FILE_TMP_ITEM
cat -s $FILE_TMP_ITEM > $FILE_ITEM_OUTPUT

# Push items to TMP FEED
cat -s $FILE_ITEM_OUTPUT >> $FILE_TMP_FEED

# Push RSS footer to TMP FEED
cat -s $FILE_RSS_FOOTER >> $FILE_TMP_FEED
echo $FILE_TMP_FEED

# Publish feed
cat -s $FILE_TMP_FEED > $FILE_RSS_OUTPUT

echo "Finished generating feed"

Some key takeaways are:

  1. sed lets you do regex with delimiters that AREN’T / so you can substitute something that shouldn’t actually ever show up in your regex. For me that is ~.
  2. I always forget you can use sed to extract between tokens - which is how I get the CDATA for the RSS: sed -n "/<article>/,/<\/article>/p"
  3. mktemp is really REALLY useful - and I feel it is underutilized in shell scripting (a quick sketch of how the temp files can be set up is below)
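
Since the excerpt above uses $FILE_TMP_FEED and $FILE_TMP_ITEM without showing where they come from, here is a minimal sketch of how that setup could look - the real script may do it differently:

# Sketch only: creating the temp files referenced in the excerpt above
FILE_TMP_FEED=$(mktemp /tmp/feed.rss.XXXXXX)
FILE_TMP_ITEM=$(mktemp /tmp/feed-item.XXXXXX)

# Optional: clean both up automatically when the script exits
# trap 'rm -f "$FILE_TMP_FEED" "$FILE_TMP_ITEM"' EXIT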

The obvious cracks are:

  1. I rely SO much on sed that it’s almost certainly going to break
  2. I don't have much other flag control to do partial generation - so if I need to start partway through or stop before the full process finishes, I can't.
  3. Things can break silently and still go through - there is no verification or manual checking along the way before publishing the feed.rss

The final two can easily be managed by writing the feed to a location that isn’t a temp file and I can manually do the cat -s $FILE_TMP_FEED > www/blog/feed.rss myself after I check it over.

But for now I’ll see if I ever have to redo it. I don’t think anyone will actually sub to this so I don’t really need to care that much if I amend the feed.

Where to put the feed URL

I never intended to provide an RSS feed. I doubt anyone but me reads this, and from my previous experience with gemini, feed generation was a bit of a headache.

A quick aside: I really only decided to thanks to Mastodon. I was thinking during the Twitter meltdown "what if Twitter but RSS" (I know, super unique idea) - basically a true "microblog", plus some OSS tools to publish your blog. This got me reading the RSS spec and looking into it more - which then led me down the path of using RSS readers more (in conjunction with gemini, and the Cortex podcast talking about using RSS more).

But I’ve decided to just put the RSS feed in the blog index, on my homepage, and that’s it. I don’t need it permanently in the header.

Conclusion

I didn't have much to share here; it doesn't make too much sense to write a big post on what can be explained better by just checking out the shell scripts in my git source. The code speaks better than I ever could.

I really, really like shell scripting.

Music Spotlight: My Top Album 2022 https://www.senders.io/blog/2023-01-03/ https://www.senders.io/blog/2023-01-03/index.html Tue, 03 Jan 2023 00:00:00 -0500

Music Spotlight: My Top Album 2022

The hype is real. I only recently wrote last year's, so I bet your hype is nonexistent, but I was writing that knowing full well there were some bangers waiting to be unleashed in this year-end review!

If you haven't read my previous post for 2021, the link is at the bottom:

The winner was “KANGA - You and I Will Never Die”

The album pool

As always the criteria:

  • it was released in 2022
  • it wasn’t a single
  • if it was an EP it has to be substantial and intentional

And the albums are…

  • Aiming for Enrike - The Rats and the Children
  • And So I Watch You From Afar - Jettison
  • Astronoid - Radiant Bloom
  • Carpenter Brut - Leather Terror
  • Cult of Luna - The Long Road North
  • Dance With the Dead - Driven to Madness
  • Elder - Innate Passage
  • Emma Ruth Rundle - EG2: Dowsing Voice
  • Giraffes? Giraffes! - Death Breath
  • God Mother - Obeveklig
  • Jay Hosking - Celestial spheres (and various other releases)
  • Long Distance Calling - Eraser
  • Ludovico Technique - Haunted People
  • MWWB - The Harvest (Mammoth Weed Wizard Bastard)
  • MØL - Diorama (Instrumental)
  • Psychostick - … and Stuff
  • Russian Circles - Gnosis
  • SIERRA - See Me Now
  • Starcadian - Shadowcatcher
  • Tina Dickow - Bitte Små Ryk
  • Toundra - Hex
  • Waveshaper - Forgotten Shapes

2022’s playlist (+ 2 albums from bandcamp not on Spotify):

The Top 5

In alphabetical order:

  • Carpenter Brut - Leather Terror
  • Elder - Innate Passage
  • Emma Ruth Rundle - EG2: Dowsing Voice
  • Jay Hosking - Celestial spheres (and various other releases)
  • Tina Dickow - Bitte Små Ryk

Carpenter Brut - Leather Terror

Some metal infused synthwave: Carpenter Brut managed to release a catchy and heavy banger of an album. Featuring a few guest performers, each of these tracks is unique and catchy in what I would consider a very "same-y" genre. It's nice having an infinite supply of retro synth tracks to drive to, but sometimes it's hard for one to really break through into "oh shit yes!". Typically, Starcadian is the one to do that for me, as they add an extra layer to their tracks through their music videos (each track being an "ear movie").

Throughout the year I found myself coming back to a few tracks over and over - especially when I was showering or doing some other short activity and I just wanted something upbeat and fun as heck!

Some featured songs to call out are The Widow Maker featuring Gunship, Imaginary Fire featuring Greg Puciato, and Lipstick Masquerade featuring Persha. I looped these three songs quite a bit. But there are quite a few more to check out.

Favorite Track

This is tough, as I looped those three songs quite a bit - each bringing their own unique energy. So I’ll pick all three - my list my rules:

  • The Widow Maker - feat. Gunship: This track is representative of the genre. It's synthwave to the core.

  • Imaginary Fire - feat. Greg Puciato: This is a metal track with synths. Greg Puciato (of The Dillinger Escape Plan fame) is one of my favorite vocalists and is immensely talented. This is probably my favorite because I can't get enough of his vocal style - the screams and the clean vocals!

  • Lipstick Masquerade - feat. Persha: This is a modern 80s track. This is what retrowave was designed around, and while tracks like The Widow Maker are more typical of the genre, this is the song they all are basing their sound off of. This is a killer pop song.

Special Commendation - Non Stop Bangers

You throw this album on and it hits you with just banger after banger. I can’t keep myself from dancing. Even as I listen back as I write this gemlog I am grooving in my chair! Like Kanga last year, this is just a series of tracks that just make you dance.

Album Link

[spotify] Carpenter Brut - Leather Terror

Elder - Innate Passage

I toot'd a bit about this album. A later release in the year, it took this year-end review and flipped it on its head. I thought the review was already wrapped up with a separate release this year, but this made the decision so hard.

Elder came at us with what feels like a return to form. Having previously released Omens in 2020 and a collaboration album in 2021, Innate Passage takes the best parts of those two albums and builds on top of more "classic Elder" albums like Lore. Elder has carved out their own niche in the genre making a blend of psych rock and stoner metal, with each release leaning harder and harder into psychedelic realms. Innate Passage has this almost ethereal feeling - especially in their opening track Catastasis.

I think, however, they've left the doom and stoner metal behind. Dead Roots Stirring and Elder (self titled) were certainly "Doomy" and in that "doom/stoner" metal overlap. Lore and Reflections of a Floating World are both still very "stoner metal". But is playing psychedelic metal with a Big Muff automatically stoner metal? I think since Omens they're probably, as a band, firmly outside of the stoner metal field - and more soundly in some psychedelic/prog metal genre?

They introduce themselves as such on their website, actually!

genre-pushing rock band that melds heavy psychedelic sounds with progressive elements and evocative soundscapes.

https://beholdtheelder.com/elder-bio/

"Merged In Dreams - Ne Plus Ultra" is the track that flips this whole argument on its head and shows that regardless, they're still very much a metal band, and one that you'll absolutely be head banging to, horns up \m/.

Favorite Track

I think “Merged In Dreams - Ne Plus Ultra”. A nearly 15 minute track that has everything in it you expect from Elder.

Special Commendation - Excellent Vinyl Record Cover

I LOVE their record covers when they do the circular inserts. You can display this vinyl with 3 separate views through the port, which, while purely aesthetic, is very nice!

The quality of the vinyl release was great, though I find any non-black vinyl has a 33% chance of being slightly warped upon arrival. I am sadly going to stick to traditional black vinyl from now on. It happens too freaking often.

Album Link

[spotify] Elder - Innate Passage

Emma Ruth Rundle - EG2: Dowsing Voice

Her second album in her "Electric Guitar" series - Emma Ruth Rundle (ERR from here on out) has released "Dowsing Voice", a haunting follow-up to last year's Engine of Hell. Holy holy HOLY hell, this album is impactful, artistic, just WOW. It's hard to describe. I was listening to it for this review and my partner, sitting behind me relaxing, said "What the hell are you listening to, this is scary!". And scary, emotional, and difficult it is. ERR stretches the use of the "electric guitar" title, as the focus here is the additional layers and voices added on top of the main tracks.

An experimental release that, at this time, is only available on bandcamp, this is one I don't put on frequently, but when I do I am fully captivated. If you like artistic records - please check this out.

Favorite Track

Probably: Keening into Ffynnon Llanllawer - I love the guitar(?) part and the wailing/vocalization. It's haunting. As a recording it's amazing.

Though “In the Cave of The Cailleach’s Death-Birth” is the /best/ track. Put some headphones on and give this a listen! Just amazing.

Special Commendation - Album Art

This album IS ART, but the album art is just… it really suits the music.

Album Link

[bandcamp] Emma Ruth Rundle - EG2: Dowsing Voice

Jay Hosking - Celestial spheres (and various other releases)

This is an interesting pick. Having released JUST in time for this year, this is an album I have been engaging with in many, many ways. Firstly, I am a patron of this performer via Patreon. They make music videos (audio only performance videos of the songs) that they compile into albums. Last year’s album is probably my actual favorite and likely SHOULD’VE snuck into the top 5 because of the final track alone, which was an emotional and just epic banger of a track (Linked at the bottom of this review).

Celestial spheres is a compilation of 8 synth jams. Jay bills these as semi-improvisational, and while the YT channel is a synth nerd's dream of informative performances, the songs stand on their own. This one is no exception. Using various pieces of hardware synths, grooveboxes, drum machines and traditional instruments - each track is unique while still carrying this /energy/ and style. It's so easy to hear Jay's tracks and know it's him.

I’ve been following him for years and really enjoy the music he makes, and the community he’s built up around his music. Due to the disconnected nature of the singles (releasing effectively as YouTube videos prior to the album drop) it’s difficult to ultimately rate these in these lists since I don’t get a chance to really enjoy them /as an album/ until the end of the year (the past two times happened like this where they came out around the end of the year). And on my playlist “Future, Tense” is present as it’s a “2022” album according to Spotify, but was out on bandcamp in 2021, and that’s when I was gifted it by Jay.

So yeah - this whole section is like “disclaimer disclaimer” but if you like groovy, typically instrumental synth music - check it out.

The various other releases

This year Jay released a few albums actually which I didn’t want to include separately. If you enjoy this album (which was mostly comprised of 2022 music, so was the primary focus) check out the other albums:

https://jayhosking.bandcamp.com/album/cinematic-works
https://jayhosking.bandcamp.com/album/away-music-for-a-productive-day
https://jayhosking.bandcamp.com/album/home-music-for-a-productive-day

Favorite Track

Without a doubt, it's Nychthemeron. It's truly a wild track, with so much happening in it. I suspect it was his favorite too since he made an actual music video for it:

[youtube] Jay Hosking - Nychthemeron (Official Music video)

Special Commendation - Each track has a live performance attached to it!

If you enjoy videos - these each have a corresponding YT video linked at the bottom of the bandcamp page.

Album Link

[bandcamp] Jay Hosking - Celestial spheres

Tina Dickow - Bitte Små Ryk

Tina Dickow (sometimes credited as Tina Dico, depending on the release) is a fantastic Danish singer-songwriter. Since her first solo album she's really found a way to elevate what is just folk indie pop. Her songwriting, arrangements, and performances are always so rich. She knows when to strip a song back - like Chefen Skal Ha' Fri, which certainly has a lot happening beneath the lyrics, but mixes those layers back a bit to let the layered vocals cut through as the song builds. Each song has so much to listen to! You can pick out various instruments and layers, yet every song would work performed with just her and her acoustic guitar. I find her style of pop music to be very engaging for that reason. I don't often listen to this style of music, but the production behind each track is so good it hooks me in. That and her beautiful voice - which drew me in first.

It's a bit harder to talk about this album given the language barrier (I do not speak Danish!), which is a shame, since her lyrics are often what I love about some of her previous albums. I've read the translations and done my own as a learning exercise, but there is a layer missing, which is a shame given how strong this album is as a whole.

I've spoken about Tina before in two previous gemlogs (Music Spotlight: Awesome EPs and 5x5 Playlists (both gemini:// links)) and she is one of my absolute favorite artists of all time. I've been slowly collecting her entire discography, which can be tricky, given a lot of copies are out of print and the remaining stock/used copies are often in Europe. (And that 5x5 playlist is very telling given most of those artists have been featured in my top albums lists and were winners! Is this foreshadowing?!)

Favorite Track

I shouldn't have introduced this section - it has been so hard each time! I think the title track, Bitte Små Ryk. It's got everything there, and is representative of the album's sound.

Special Commendation - Lovely

This whole album is lovely. There is emotion here too, and while I don't speak the language it's often very clear. But I love Tina and her music. It's lovely and hits this spot in me that's just warm.

Album Link

[spotify] Tina Dickow - Bitte Små Ryk

My Top Pick

This year has been especially hard, since I spent so much time listening to 2021's releases, which are some of my favorites of all time. And between 2021 and 2022 (as mentioned in my 2021 spotlight) nearly every one of my favorite artists released an album. So I have been blessed with a lot to listen to.

Anyone following me on mastodon may have seen Tina Dickow just owning my entire wrapped campaign, but with Elder releasing their album after the data collection stops for wrapped, that certainly isn’t telling the whole story.

And it wouldn't be a top album list if I didn't mention Starcadian being consistently in the top 10 year after year, just narrowly missing the top 5 - though technically, this release was in my 2020 list, as it was available then, but had since been pulled, and was released "officially" in 2022. From what I can see it's the same tracklist, but the "inspired by" credits are entirely gone from the 2022 release.

Elder - Innate Passage

Each year picking the winner is hard. Part of the reason I do this is I don’t really add stuff to the list I don’t like. A LOT of music comes out each year, and I add what I listen to. I don’t listen to music I don’t like - so by nature of the process - each album is a “top album” for me.

But the top 5 is usually a mix of "omg obvs" and "yeah turns out I threw that on way more than I expected" (Carpenter Brut). But it's really always a fight between those "obvs" - this year it was Elder and Tina Dickow. Their releases were seriously top tier and repeat listens.

Tina came in with the advantage of releasing in April, and Elder JUST released theirs at the end of November. But I did some math on my mastodon breaking down the comparison. Elder came at us with a longer album, under half as many tracks, and over 2x the average song length (about 10min/track).

They didn't waste a single second (neither did Tina), but it's just such an accessible album - direct pure energy and power - BOOM! It was great.

This should’ve been a tie

Honestly, I was ready to call it a tie. I am actually writing this minutes before posting it, because that’s how undecided I am and how close this is.

Tina Dickow deserves the number one slot any other year, and I hope to see more albums like these from both her and Elder in the next few years! Both are classic albums in their discographies (and both albums I own and spin regularly). I forced myself to pick, and just knowing me, my tastes, and all the stuff I said above - I went with Elder. But seriously, listen to this record - Tina manages to pack in so much musicality while carving out a unique sound and just amazing style. I love her <3 :)

And if her music isn’t your jam - check out her guest tracks on the Zero-7 stuff - angelic voice.

Conclusion

I am REALLY disappointed I had to choose between Elder and Tina Dickow this year. Similarly, last year I had Raised by Swans, ERR, and Kanga! And our winner in 2020 was Bell Witch. These ARE my top six favorite musical artists currently active.

I’ll talk about music trends and my tastes later on. But I just wanted to emphasize how much of a banger these last 3 years have been musically and I am grateful I get to share these with you here.

I am really excited for 2023!

This year’s playlist (2023)

[spotify] senders' Releases 2023 Playlist

Links

If you use gemini:// you can check out my previous posts (until/unless I decide to port those over too)

Thanks for reading! I don’t always crosspost - I am trying something out :)

RSS - A Follow-up https://www.senders.io/blog/2022-12-31/ https://www.senders.io/blog/2022-12-31/index.html Sat, 31 Dec 2022 00:00:00 -0500

RSS - A Follow-up

Get an RSS reader and connect everything to it!

Between switching to Mastodon for my social media allowance and using a dedicated RSS reader, I've really cut down my overall consumption and wasted PC time.

This blogpost was originally posted to my gemini gemlog: gemini://senders.io/gemlog/2022-12-31-rss-a-follow-up.gmi which is where I do most of my writing, converting some useful-to-share things over here. It is also where the original RSS gemlog this is a follow-up to was posted. For context, I wanted to cut back on a lot of my web consumption, wasting time and just being mindless online. So I looked to RSS to help centralize and solve this issue.

Recap

So I am using https://tt-rss.org/ as my RSS aggregator. It’s a self-hosted RSS aggregator that, using profiles, allows you to subscribe to multiple feeds and have them “synced” between multiple devices (they’re not synced, you’re connecting to a central server). I like this because I don’t ever have to worry about dismissing, reading, or marking anything on my phone to have it still present on my PC. And I don’t have to worry about feed subscriptions or my phone pinging a bunch of feeds, or obviously, any third-party hosting.

How I’ve been using it

So as always, please send me interesting RSS feeds! Or even your own! I am trying to read more blogs, and if you have something you enjoy drop me a DM or email! I’ll share what I am following throughout this section <3

Blogs

Obviously, I am following blogs, one of the last holdouts of RSS. I have a few that I follow, mostly other transfolk on Mastodon that I found had their own blogs. Most non-trans folks I follow are using gemini and still rely on the feed aggregators for that.

If you’re interested the two main ones I am reading right now are:

  1. Erin In The Morn (substack)
  2. Selfaware Soup

Both have been pretty insightful. Erin shares a lot of US transgender news, which is good since I have dropped off using Reddit, which is where I "got" my "news" from.

Podcasts

The other mainstay in RSS is podcasts. Some even say if a podcast can't be consumed via RSS, is it even a podcast? I would agree. Everything else is just a show. I don't need the content to be consumable from my reader, but I'd really appreciate it if it were. I am always on the lookout for more podcasts though, with the only two consistent listens being:

  1. The Pen Addict Podcast (relay.fm)
  2. Cortex Podcast (relay.fm)

And currently off-season:

Which has a YouTube video format. Though I honestly really don't care for Austin Evans; I just enjoy consuming some F1 content and pretending I have friends I can talk to about motor racing.

While writing this section I added:

I have yet to listen; some of the topics seem interesting, and it being infrequent gives me hope it's quality over quantity. (And I like having podcasts for chores to distract my brain.)

Tech News

Right now I follow two main news sources in tech:

  1. debian.org/news
  2. LWN.net

Running servers on stable Debian, it's good to know when security updates come in, as well as distro updates. And LWN is fantastic; I've been a subscriber for many years, and while it sometimes (Jake) can focus a bit heavily on Python news, it has always been interesting to read.

This is the section I plan on adding more and more to. I had other tech blogs that just felt like clutter and were pushing out daily articles that I couldn’t care less about (opensource.com cough cough). But that’s just me. Tech news is mainly where I want to focus - since fluff blogs are rarely my cup of tea.

LWN has some links in their weekly editions for other news feeds I might consider directly subscribing to, but for now I have these.

Music News

Some folk have an RSS feed for their site updates, which I appreciate. Some use sites like Squarespace but don’t properly connect up the RSS feed which I do NOT appreciate.

So right now I have two band sites that DO seem to update it (as their sites align with their feeds) - but the only one I'll mention is raisedbyswans.com. I've spoken of this artist in my Music Spotlight MANY times and he is one of my favorites. His site, while entirely simple, is set up with RSS and has been publishing his updates consistently. I appreciate this. Always a strong rec from me!

I’ve been toying with Music Review sites that talk about new releases in the genres they specialize in, but I haven’t settled on anything that is helping me discover new music.

YouTube

This is probably where the biggest change has actually come in. Having my YouTube feed fed through RSS has been fantastic. I am able to refresh and not miss any updates (YouTube sometimes likes to pull updates in out of order, so I don't see a video because it's buried between other videos that I'd already seen).

But this also allows me one further level of filtering on my YouTube subscriptions. I can stay subscribed to channels I am interested in watching occasionally but not every video, and keep those off my RSS feed. And for the “I like to watch most if not all the new videos” I can subscribe to those via RSS. So it’s like the “bell” but without the app basically. And since on Mobile I do NOT use the YouTube app (so I can take advantage of the Ad Blocker in Firefox) that’s great!

What sucks / is tricky is actually subscribing to the RSS feeds because YouTube buried that feature now. You just need the channel_id or the username and you can subscribe using the following URL:

https://www.youtube.com/feeds/videos.xml?channel_id={ID}

And you can obtain the channel_id from the URL if present (though with aliases now (@channelname) it's rare to see a channel_id in the URL); otherwise a little console JS can print it out:

ytInitialData.metadata.channelMetadataRenderer.externalId

A note however: you'll need to clear the console if you navigate to the next channel - at least in Firefox it caches the result otherwise and you'll print out the duplicate value. There are some tools that can turn your subscriptions list into these feed URLs for bulk subscribing. I've lost the link (and it's what I did initially), but I recommend doing the manual add, at least to focus on the channels you WANT in RSS, since you can always fall back to the main subscriptions page on YouTube.
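
If you'd rather stay out of the browser console entirely, that same externalId can usually be scraped straight from the channel page, since it's embedded in the served HTML. A rough sketch (this assumes curl and GNU grep, that YouTube keeps embedding the ID this way, and the handle below is just a placeholder):

# Sketch: build a feed URL for a channel by scraping its channel_id
CHANNEL_URL="https://www.youtube.com/@somechannel"   # placeholder handle
CHANNEL_ID=$(curl -sL "$CHANNEL_URL" | grep -o '"externalId":"[^"]*"' | head -n 1 | cut -d'"' -f4)
echo "https://www.youtube.com/feeds/videos.xml?channel_id=${CHANNEL_ID}"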

But what this has given me is the ability to effectively ignore YouTube almost entirely. Ideally, I’d script something with YouTube-dl but I don’t REALLY care that much, and I’ve gotten into the habit of closing the tab after the video so I don’t stick around and get sucked into the algorithm.

What my morning looks like is sitting down, switching to my tt-rss tab, seeing what’s fresh, and watching a video with my coffee maybe, then just moving on and doing something else. I still lurk Mastodon, or get sucked into my computer in some way or another, but it’s been really positive! I can count on one hand how many times since dedicating to RSS I’ve just clicked around YouTube.

Hobby

The last section, which really is an extension of Blogs/News, is "hobby" RSS feeds. These feed a bit into the consumerist side of life, which is why I keep them separate. Right now it's almost entirely fountain pen related (Who'da thought this community would still be writing blogs :P), but since most of the blog posts are either about products or reviews in some way, I try and limit how much I expose myself to them. I have been working on a draft about consumerism for quite a while now and just haven't really worked it into a post that isn't just "DAE consumerism BAD?" low-effort Toot level. (But basically, I kinda hate how all my hobbies, and hobbies in general, rely heavily on a consumerism mindset, GAS, and such.) So I've been trying to be more appreciative of what I already have and such.

But these blogs are nice, and often keep me in the know about my hobbies so I can react to anything meaningful that's being released. A good video sorta on this topic was by Adam Neely (Adam Neely - How In-Ear Monitors are Making Better Musicians), about how his band spent $6000 on gear for their tour, but what it did was eliminate stress and enable them to more easily fine-tune and control how they monitor their live performance. He touches on the fact that gear videos feed into the consumerist mindset of music making, but gear is often necessary to facilitate certain things, and setting up a portable in-ear-monitor rig for their entire band is, well… unavoidable. It's just a minor aside in a much deeper video about IEMs and touring and FEEL. And quite the departure from his usual music education content. But it sums up the main thesis of my consumerism gemlog quite nicely I feel (or at least I am projecting my thoughts onto a brief aside he makes).

tt-rss - in retrospect

So tt-rss is fine honestly. I think I need to set up a better theme, something that has a bit more contrast. I don't REALLY read in it, I just use it as the aggregator and then open the links directly. I don't mind the way it renders the full articles with images, but I do mind how GREY it is by default (in the "night" theme). It looks totally customizable and I bet I can download a decent theme for it if I look. I may spend some time doing that and try to read more in the application.

But other than that it’s been quite the improvement over my internet experience. More RSS!!

Conclusion

I need more feeds, as I do enjoy reading. So I'm always on the lookout. I hate to throw in engagement-y things like "let me know" stuff, but I am genuinely looking for interesting suggestions for stuff you might subscribe to over RSS. Even if it's just "this is my weblog" :) I always like reading people's things. I should trawl the aggregators and look at folks' capsule landing pages to see what is linked!

Anyway, you should look into getting an RSS aggregator setup. It’s been really impactful on cutting down on internet scrolling and mindlessness.

CSS Themes Exist Now!? https://www.senders.io/blog/2022-12-05/ https://www.senders.io/blog/2022-12-05/index.html Mon, 05 Dec 2022 00:00:00 -0500

CSS Themes Exist Now!?

Yeah, news to me too! According to MDN it seems it's been supported since 2019 in most browsers, and by all of them now.

This is so wild!

Why is this cool?

Well you may have noticed this is in dark mode now (if you set your preferences to dark in your OS/Browser). But this is cool because it means we’re no longer restricted to using Javascript and custom preferences for websites.

I had assumed this existed because sites like GitHub were defaulting to darkmode despite me never setting anything in like my profile settings. But I just assumed based off of my legacy knowledge this was some custom render trick using javascript.

Still no JS!

I keep this blog JS free! While not all pages under the senders.io umbrella are javascript free - everything in www.senders.io (this blog) will always be.

I try to keep that, not only for my sake, but for your sake too - a javascript free blog means the priority is reading.

Examples

So I achieve darkmode in this blog by doing the following:

/* default / light */
:root {
  --background: white;
  --font: black;
  --quote: #eee;
  --link: #0303ee;
  --linkv: #551a8b;
  --linkf: #f02727;
  --articleborder: #060606;
  --tableborder: #aaa;
  --tablehead: #ebcfff;
  --tablez: #eee;
}
@media (prefers-color-scheme: dark) {
  :root {
    --background: #1e1e1e;
    --font: #eee;
    --quote: #444;
    --link: #00d3d3;
    --linkv: #cd78f4;
    --linkf: #f02727;
    --articleborder: #23ed9b;
    --tableborder: #aaa;
    --tablehead: #6f5a7e;
    --tablez: #313131;
  }
}

Essentially, I leverage CSS variables to define the specific areas where I set theme-specific colors (my nav bar is static regardless of dark/light mode, for example).

Then if the media preference is dark - I overwrite the variables with my dark mode values!

What's tricky is that originally most of these properties didn't actually HAVE values set - I relied on the system defaults for things like links and the page colors, in an effort to use minimal CSS as well.

I still feel like I am honoring that, since I don't have to duplicate any actual CSS this way - I just have a lookup table of color values.
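
If you haven't used CSS variables before, the lookup table above only does something because the rest of the stylesheet reads it back with var(). The selectors here are just an illustration - they aren't my actual rules:

/* Illustrative usage only - the real stylesheet's rules differ */
body {
  background: var(--background);
  color: var(--font);
}
a { color: var(--link); }
a:visited { color: var(--linkv); }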

That being said my CSS file is still only about 3kB which is not so bad. And I’ve actually covered most themed properties already - links, tables, quotes.

Toggling Themes

Something else I found out during this experiment is that you can actually toggle the themes directly in your developer tooling. Opening your devtools and going to the Inspector (in Firefox at least), there are two buttons in the styles section, "toggle light color scheme" and "toggle dark color scheme", using a sun and moon icon.

This made testing VERY easy, and noticing it is actually what prompted me to look up whether this was a standard CSS thing or not. So thanks Mozilla!

Conclusion

Yeah if you’ve never realized this check out the MDN guides on both variables (I didn’t realize these got put in the standard either!) and themes!

My Markdown -> HTML Setup https://www.senders.io/blog/2022-11-06/ https://www.senders.io/blog/2022-11-06/index.html Sun, 06 Nov 2022 00:00:00 -0400

My Markdown -> HTML Setup

A common way I see a lot of people blog, especially micro-blog, is in markdown.

Markdown is a lightweight markup language for creating formatted text using a plain-text editor.

Wikipedia | Markdown

It built itself on top of common syntax prevalent on the web and was designed to be converted into simple HTML output. Since it leveraged preexisting syntax it was easy for new users to pick up, and it is now found all over the web and applications.

Since I started this website, I had been writing each page by hand using a few tools to facilitate that - and for a while I had been looking for a good way to try out using markdown to generate some lighter pages and these blogposts.

Writing HTML by hand

When it comes to blogging, a lot of platforms offer a WYSIWYG editor - allowing users to write in rich text that then gets converted into HTML in the style of the platform. But in my case, since I self-host this website, I decided to stick to my roots and write PURE HTML instead.

HTML is fairly simple and easy once you get used to the basic structure of the system. And since I've been working in HTML for almost two decades now, at the time it felt like the best solution to make a clean website.

I briefly touched on my design process in 2019-01-21 - First! A New Years Resolution outlining that I wanted to make a very lightweight and simple website. And at the time I believed the best way to achieve this goal was to carefully structure and craft my website’s HTML by hand.

This article is making the process sound far more difficult than it is – it’s mostly just tedious.

<article>
<h2> Title </h2>
<p>
   Some paragraph....
</p>
<h3> some subsection </h3>
<p> more text </p>
... etc

Is essentially what the website looks like - you can view the source of this page to see – it’s very simple HTML.

The benefit I found in doing this (mostly leveraging tidy) was a very easy-to-edit codebase. And by leaning on the existing tags and their properties I also attempted to keep the styling to an absolute minimum, using existing tags to enforce the styling I desired.

Only for certain areas (tables, code, quotes) where readability is an issue do I set up custom CSS.

Most of this process will actually continue to happen - but the writing itself will now be unobstructed by the tedium of writing HTML.

Micro-blogging in general

At the time of writing this, I have not ported over any of my Gemini micro-blogs. This warrants a longer post, since I wrote consistently in gemini from March 2021 through May 2021 – having only stopped because a long move led to a lot of server downtime, which broke the habit. I updated my gemini capsule multiple days a week - mostly due to the extremely lightweight and limited nature of the platform.

Gemtext

Gemtext was the gemini protocol's standard MIME type (text/gemini). It was a basic markup language that relied on line-based syntax. It was purposefully kept as lean as possible because this was what was ACTUALLY being served to clients – unlike Markdown, which first needs to be converted to HTML, gemtext was the actual text served and rendered in the viewer's client. You could customize the style of your client - but you could not, as an author, dictate how your content would be viewed. This meant the only aspects of your blog you had control over were the actual content and its structure – which for a blog is really all you should care about.

Its syntax contained most of what I was already using here in HTML:

  1. headings
  2. paragraphs that were wrapped based on page-width
  3. links
  4. lists
  5. quotes
  6. preformatted-text / codeblocks

Besides links - it also leveraged the same common syntaxes that markdown did.

Gemtext links

From my brief time in the IRC and in geminispace in general - a lot of the "recommendations" from new users were about providing in-line links. The philosophy was that by forcing links to exist on their own line, clients could configure how they wanted these to be displayed and not have to worry about links interfering with the text.

Like Gopher (and unlike Markdown or HTML), Gemtext only lets you put links to other documents on a line of their own. You can’t make a single word in the middle of a sentence into a link. This takes a little getting used to, but it means that links are extremely easy to find, and clients can style them differently (e.g. to make it clear which protocol they use, or to display the domain name to help users decide whether they want to follow them or not) without interfering with the readability of your actual textual content.

gemini.circumlunar.space – A quick introduction to “gemtext” markup | Links

I felt that this provided a lot of useful limitations that removed a huge barrier for me to actually write down ideas without feeling overburdened. I also lurked in the IRC - as well as implemented my own gemini server.

As a quick aside – the Java server was a lot of fun! The protocol was very simple to work with for basic gemtext. I felt the ultimate downside was trying to build both something for basic gemini capsule hosting (like I was using for a decent chunk of my time with gemini) - and something for developers to use as a base application server. At the time, in 2021, a lot of talk was happening on IRC about users starting to look to provide more complex experiences via the protocol, and I wanted a way for those interactions to be built out in Java - since most were in Go or Python at the time. This decision led to me burning out due to the difficulty of cleanly splitting those responsibilities - so you could host a capsule alongside your application - since I lacked experience with more complex Gemini capsule applications.

But it was a good experience, and I got hands-on experience with certs, Netty, and SNI - which actually came in handy at my job!

Wasn’t this about Markdown?

A lot of what I liked about Gemini I found missing when I returned to the World Wide Web. Writing a new post was tedious and I actually had a few drafts sitting unposted. They’re probably checked into my git at this moment! So I thought - why not just use markdown and convert to HTML? That’s what it’s built for - and I already designed my site to work with minimal customization of raw HTML tags!

How I use Markdown

Firstly, this blogpost was written in Markdown (with minimal HTML sprinkled in). Then I render the markdown into HTML using Discount. Frankly, I don’t know how I stumbled across this markdown parser - I think it came pre-installed on my KDE Arch system because another KDE program used it. But I liked it, and it seemed extensible enough for my needs.

This would produce the “body” of my articles - and I could then prepend and append the template-head and foot to my html output to form a blog post/web page.

Customizations

After I generated the output file, I replaced some placeholders in the templates via sed and then tidy'd the HTML. The only other major issue was that Discount had no way of appending any link attributes – so for external links I had sed append the rel and target attributes - which works off the assumption they're not already there. A lot of my home-server scripts rely on assumptions…

This is all bundled up in a simple script file so I can just supply a few arguments and the full page is re-rendered on command.
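As a rough illustration only (not my exact script - the template file names, the {{TITLE}} placeholder, and the sed expressions here are hypothetical, and I'm assuming Discount's markdown binary), the whole pipeline boils down to something like:

  #!/bin/sh
  # render.sh <post.md> "<title>" - a minimal sketch of the markdown -> HTML flow
  set -e
  md="$1"
  title="$2"
  {
    cat template-head.html
    markdown "$md"            # Discount's CLI turns the markdown into an HTML fragment
    cat template-foot.html
  } > index.html
  sed -i "s/{{TITLE}}/$title/g" index.html   # fill in hypothetical template placeholders
  # append rel/target on external links (assumes they aren't already set)
  sed -i 's|<a href="http|<a rel="noopener" target="_blank" href="http|g' index.html
  tidy -quiet -indent -modify index.html || true   # clean the final HTML in place; tidy exits non-zero on warnings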

Two Sources of Truth

In the system I devised the markdown files are really the "source of truth", but you could argue that the HTML files hold equal weight - as they're what you're reading right now. The markdown is only useful if I render it as HTML. There exist nginx extensions to serve markdown as HTML, so I could just store everything as markdown. I could also provide some heading information in the markdown files to remove the command arguments, and have the .html files generated in place on boot before launching the site… But these are all nice ideas for a later date.

Ultimately, this is something I contribute to occasionally - I don't need something too complicated. I just need to output some HTML a few times a year. So if I manually publish the HTML each time, that's likely far more efficient than re-rendering.

Learnings

This is the first post that uses this setup - though I've already converted one page over to it. Once I worked out the kinks and built a flow that works for me, it made the writing process a LOT easier. Another issue before was that once I tidy'd an HTML file, it became frustrating to edit, and I didn't always re-tidy it. Because the output is now always tidy'd by the script, I can edit the raw markdown as needed, and the script will generally output the same file (with whatever changes I made, of course). This makes the editing and the git history a lot clearer.

I would recommend writing in markdown - or even trying out gemini - you can even host your gemini capsule on the web! (Most gemini webpages are converted gemini capsules.) I am sure other "blog focused markups" exist too.

]]>
Manjaro Followup - Breaking things! https://www.senders.io/blog/2021-01-05/ https://www.senders.io/blog/2021-01-05/index.html Tue, 05 Jan 2021 00:00:00 -0500

Manjaro Follow-up - Breaking things!

I wanted to write a quick follow-up covering how I managed to break, and then recover, everything when I went to remove my old debian partition.

Recap

To recap: I installed Manjaro alongside a Debian/sid and Windows 10 install. Each of those OSs was on its own SSD. I went from a 128GB SSD with Windows installed to adding a 256GB SSD and installing Debian on it. Years later I split the Debian SSD into two parts, installing Manjaro on the new slice. Since my last update I have been playing around with Manjaro, and having recreated my i3 keybindings in KWin I've been pretty happy. But then I started breaking things.

Break stuff

I broke my Manjaro by updating my Debian (apparently). To be honest, this is the one part where I don't fully understand why it happened. From what I could find online, I hadn't set up my system to properly handle two separate Linux OS installs. Whatever the cause, I was no longer able to boot directly into Manjaro without using the initramfs fallback boot option. I only updated my Debian install because I was debugging something on my work machine, and both run Debian/sid. (Otherwise I would've used my server, which runs Debian/Stable.) But considering I hadn't had any need to boot back into Debian, I decided to just get rid of it!

GParted, Grub, Gotchas!

I went in knowing I'd have to fix my Grub, since I'd be removing Debian, which was the OS I configured Grub from when I first dual-booted the machine - so I assumed they were linked somehow and I would need to reinstall it. The process I followed was:

  • Create a GParted Live USB
  • Launch GParted and reconfigure my partitions
  • Open the terminal in the live USB and reinstall Grub

The 3rd point being a bit of a "rest of the owl", I wasn't sure what to expect. GParted thankfully warns you "you're probably going to break stuff, see our FAQ", which had a section on reinstalling grub. Reading that, the 3rd step became:
  • mount the linux OS
  • bind the live dirs that are needed: /dev /sys /proc
  • chroot into the mounted folder
  • run grub-install <device>

But what I failed to realize (stupidly, in hindsight) was that the "device" is the Master Boot Record (MBR) device - so in my case the Windows drive, /dev/sdb. I had assumed it was the device of the linux install, so I tried that and got notified that my EFI boot directory didn't look like an EFI partition... and from here it was rabbit holes.

Where is my EFI partition?

I have a fairly old Windows 7 install that was upgraded to Windows 10 during this whole journey. I've been meaning to reinstall it (on a larger drive). But rather than having a few partitions on that drive (typically there's a boot partition), I just have the one (plus a recovery partition). It's marked as boot, and I found it was even mounted to /boot/efi once I was able to boot into Manjaro again. But it made no sense to me. If I needed an EFI partition, why was my efi pointed at the root of my Windows C drive? The rabbit hole consisted of:

  • Creating a 200MB Fat32 Boot partition
  • Mounting that as my efi-directory
  • Reinstalling grub (again on my Linux device)
  • Eventually getting it to boot straight into Manjaro
  • Modifying my /etc/fstab to mount my boot/efi to the new partition (oops)
  • Repeating the above steps 5 times hoping something would be different
  • Eventually finding in a forum that grub should be on the MBR...

The Fix and Final Steps

The fix was to basically follow the steps above, but targeting the MBR (a rough sketch of the commands follows the list):

  • Boot GParted Live USB
  • Properly configure any partitions (in this case, delete the "EFI" partition)
  • Mount the linux device
  • Bind the necessary live dirs to the linux mount
  • Run grub-install to the MBR device
  • Reboot
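In shell terms, the whole dance from the GParted live session looks roughly like this. This is a sketch, not my literal history - the Manjaro partition name is a stand-in, and on my machine the disk holding the MBR happened to be the Windows drive, /dev/sdb:

  # run from the GParted live USB; /dev/sdc1 stands in for the Manjaro root partition
  sudo mount /dev/sdc1 /mnt
  for d in /dev /proc /sys; do sudo mount --bind "$d" "/mnt$d"; done
  sudo chroot /mnt
  grub-install /dev/sdb                   # the disk with the MBR, NOT the linux partition
  grub-mkconfig -o /boot/grub/grub.cfg    # regenerate the grub config
  exit
  sudo reboot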

It was that misunderstanding about the MBR that sent me down the wrong path, but now I at least feel semi-confident changing around my OSs, knowing how to fix Grub. But what about the fstab?

Like all true movie monsters, my stupidity came back for one final scare. I booted into Manjaro, from Grub!, only to have it crash on me. It couldn't mount one of the devices - the deleted partition! I was dropped into the recovery shell and was able to modify the fstab to point back at the correct /boot/efi device. (Thankfully I was familiar with fstab to begin with.) But editing two files in a super-low-res terminal is not my idea of fun (okay, maybe it is).

Conclusion

One of my new year's resolutions was to learn more about my system. So lighting a fire I then had to put out was a great way to pick up some more knowledge on grub maintenance and dual-booting.

]]>
Manjaro Experiment https://www.senders.io/blog/2020-12-17/ https://www.senders.io/blog/2020-12-17/index.html Thu, 17 Dec 2020 00:00:00 -0500

Manjaro Experiment

After years on Debian, running i3, I decided to try out a more traditional Linux setup, and take a stab at gaming on Linux. I chose Manjaro for a few reasons:

  • It's not Debian based (it's arch btw /s)
  • It's still on Systemd so I won't lose that familiarity
  • For gaming it comes with pretty up to date drivers and setup for running Steam games
  • It has a KDE installation which is what I wanted to run

Why "not Debian"

Debian is home for me. I have used it for years on work machines, servers, and my personal desktop. But it comes with its own quirks. For starters - I am running base Debian, not a Debian-based distro, which generally means some packages are out of date. To get around this I run Sid/Unstable. This hasn't been a particular issue, but sometimes there are version conflicts and other nuisances, and no real easy way to get every package into the proper version configuration. This was a particular pain point when getting Steam (non-free too, which adds another layer of configuration), Wine, and a few other packages all set up. Plus 32-bit!

i3

I have been using i3 as my window manager, without really any other desktop environment programs. My login is the typical tty debian login. But running i3 and having windows appear - especially game windows, which can be temperamental - get tiled, only to have to break them out again, is just a hassle. While I could've gone with another Debian base running a proper desktop environment + window manager, I figured that'd be boring and I'd just be trying out the programs and not the Linux, which is half the fun.

That being said, i3 is Linux for me. Being able to just move between windows with a macro, and every bit of it being intuitive (after you've learned it!), is a productivity booster. Which is why I still use it on my work machine, and can't see myself ever switching off it.

KDE

I've used Gnome and XFCE as desktop environments before, and they're fine, but I've always liked the customizability, flexibility, and polished look of KDE.

Setting up KDE for an i3 addict

By default KDE isn't really too hard to "get used to", since it feels like any other OS, especially a Windows setup. But the main thing I needed to change was the meta+<key> commands.

  • Remapping the Virtual Desktop changes
  • Remapping the KWin window focuses
  • Remapping the KWin move to desktop
  • Installing DMenu
  • Shrinking the "start bar" panel
  • Removing Pager
  • Changing Task Manger to Window List
  • Configuring Desktop Layout to "Desktop" (this removes the icons)

Doing this helped make me feel at home so far, without having to retrain my brain.

Some of the key remappings

Setting up the KWin window keymappings was really what made me feel at home. For the first few hours, I felt as limited in my productivity as I do on Windows. KDE and Windows share a lot of the same default keymappings around window manipulation and virtual desktop changes.

  • Switch to Desktop N - set to meta+<N>, where N is the desktop 1-10 (0).
  • Switch to Window Left/Right/Up/Down - this was the one I was nervous wouldn't exist as a keybind. What was meta+alt+<dir> by default I remapped to drop the alt. This removed the very annoying inability to just jump between browser and terminal, or especially between two separate terminals.
  • Quit Window - meta+shift+Q.
  • Tile Window - remapped to use the Shift key instead, since meta+<dir> was now taken by the focus switching.

Manjaro

So I went with KDE Manjaro. Manjaro aims for the gaming desktop experience. Arch is new for me, so I feel that would be something to adjust to and learn.

Gaming

It has only been a day with it as I write this. But I was able to get a fair amount of the fighting games I wanted to play to work.

Proton + Steam

So far my main focus has been running the fighting games I noodle around on in Steam. To do this I launched Steam, enabled Proton, and set it to run for all games, regardless of compatibility. None of the games I hoped to run worked this way. I then opted into the beta for Proton, running the experimental builds, which should generally have the more up-to-date tunings for games. With this setup I was able to get Soulcalibur VI to work. Battle for the Grid and Dragon Ball FighterZ both had launching issues. So I looked around and found Proton GE Custom, which is a custom fork of Proton that contains custom settings and tweaks for various games. One of them is Battle for the Grid, which is how I found it. Using this I was able to play every game except Dragon Ball FighterZ! A callout for Dead or Alive 6, which is performing questionably. It can run and isn't actually too bad, but in windowed or borderless it stutters and drops frames.

Other issues

Even on Windows there are issues with some games and your standard configuration. Disabling the Steam Overlay and adjusting the Steam Input setting helped get some games working.

Conclusion

Gaming on Linux is still not great. But it's MILES ahead of where it was even a few years ago when I set up this PC. And I think it will take some adjustment getting a feel for an i3-less workflow.

Update!

NTFS mounting

Update! I got DOA and a few other games to run a bit smoother by remounting my NTFS drives properly. I ended up using the following /etc/fstab configuration for my NTFS drives:

UUID=<drive-id> /mount/path ntfs uid=1000,gid=1000,rw,user,exec,async,locale=en_US.utf8,umask=000 0 0

I had noticed that both steam and mount.ntfs were running at 20-40% CPU while not really doing anything, and then upwards of 80% during gameplay.

i3 Compatibility

As I spend more time using the OS I made a few more adjustments:

  • Removed everything except the Clock and System Tray.
  • I added KRunner to meta+space to ease running KDE specific programs that I can't be bothered to memorize the name of
  • Back and forth on forcing "No border" on all windows. Part of the reason I moved away from i3 was so that I had better floating window management. And doing this would basically put me in an equally hard to manage system for floating game windows. So until I find a plugin that makes small taskbar/borders for the windows I'll be sticking with the default.
  • On Manjaro at least: UNINSTALL mesa-demos! (sudo pacman -R lib32mesa-demos mesa-demos) This package has the annoying "fire" demo, which made opening firefox via dmenu a pain in the ass.

The biggest difference was removing the Application Launcher from the main panel. Having it there really felt like a crutch for running programs. Running apps via dmenu with meta+d is, I would say, roughly equal to hitting just meta to launch the Application Launcher. However, its bulky UI, even using just the Window List, took away from the look/feel I was going for.

]]>
Bread Blog (First post) https://www.senders.io/blog/bread/ https://www.senders.io/blog/bread/index.html Mon, 17 Feb 2020 00:00:00 -0500

Bread

I decided to make a singular dedicated page to my recent bread bakes. I am trying to at least keep a log of each bake, what went wrong/right in hopes of nailing a recipe that works best for me.

February 17, 2020

First post! I have done four bakes in 2020 that are worth mentioning: three that ended up rather successful, and one lesson learned. Because this is my first post, it contains three very similar bakes that were effectively the same recipe.

Boules

I have made two very good boules in 2020. I first made a pate fermentee using the following ratios, with 50% of my total flour weight (500g total, so 250g).

Pate Fermentee

Item                 %
Flour (Bread)        100%
Water (Room temp)    70%
Yeast (Instant)      0.55%
Salt                 10%

To make the pate, I mixed all the dry ingredients together, then added the room temperature water. I let that loose mixture rest for 15 minutes. Once it had rested, I wet my hands and bench (lightly) and kneaded for roughly 8 minutes. After kneading I tightened the dough into a boule and let it sit in a greased, plastic-wrap-covered bowl for an hour. After an hour I placed it into the fridge, as is.

The next day, basically in the AM when I had time to bake, I took the dough out of the fridge, cut it into smaller bits (four), and let it come to room temperature (ish, about an hour). I prepped the same ratios above, except with warmer water (~108°F). When I added the water to the dry ingredients I added the pate along with it, using the curved edge of my scraper to cut into the pate and incorporate it fully. Once I felt it was all one loose mess I let it sit for 15 minutes. After the 15 minutes I wet my hands, and bench, and kneaded the dough for 8 minutes. After kneading I formed the dough into a boule and placed it into a greased bowl covered in plastic wrap. I let that sit on my bench for 90 minutes or so.

After the first proof I dampened my bench, took the risen dough out of the bowl, and lightly pressed it into a thick circle. I then took the, what would be, corners of the mass and folded them into the center, rotating after each fold. This process creates a boule shape while building tension. I would continue to do this about 8-10 times, really until it felt like I couldn't grab any more / it wouldn't stick. Then I flipped the dough over and tightened the boule with a scooping motion as I rotated it, and placed it into my floured banneton. I let it rise again for about 45 minutes. Around the 30 minute mark I would preheat my oven to 500°F.

Once the oven was preheated and it had been at least 45 minutes, I flipped the dough out onto the peel (dusted with corn flour) and scored it. I then misted the top with a spray bottle of water and slid it onto my baking stone. While preheating the oven I had also set a kettle to boil some water, which I poured into the preheated baking sheet on the bottom rack. I set the timer for 10 minutes, and every two minutes or so I would add more boiling water. After 6 minutes I rotated the dough using the peel (careful not to damage it) and misted the facing side with the spray bottle (I found the back stays lighter, so this helps make the entire steaming more even). After the turn and mist I added twenty minutes to my timer and dropped the temperature to 450°F.

This produces a nice, well risen boule with a golden brown crust.

I skipped the pate in my most recent bake and just did 100% (500g) starting from "day 2". I also substituted 100g with AP flour.

Baguettes

I actually did the boule recipe first for my baguettes. I aimed for 1000g of flour, so my pate used 500g with a 50/50 AP/bread mix. I screwed up the ratio for the yeast and added almost double. The recipe is essentially the same, with the final steps being the difference.

After the first proof I sliced the dough into three chunks. I formed those into boules and let them sit for 5 minutes. After resting I rolled them into batards and let them sit for 10 minutes. After 10 minutes I rolled them into baguettes and placed them on the baguette sheet. Then, after letting them rise for 45 or so minutes, I baked them.

Accidents

Baguette rolling is hard. And I need to let the dough rest longer between each shape.

1000g for three ~15 inch baguettes is too much. I would do 750g next time.

Proofing on the sheet is not recommended in the future, as they rose really well (probably all that extra yeast!) and ended up sticking together.

I broke my oven light with my spray bottle. And I ruined my cast iron's seasoning using it for the boiling water.

What to do next time

For my next french style boule, I want to do a pate again, as I've only done it for one boule loaf. And I want to try making two loaves from it.

Resources

Bake With Jack's YouTube channel really helped me shape up my shaping. And the core of the pate + french bread recipe is based on the one from The Bread Baker's Apprentice.

]]>
remember/recall - what could’ve been a command line tool https://www.senders.io/blog/2020-01-13/ https://www.senders.io/blog/2020-01-13/index.html Mon, 13 Jan 2020 00:00:00 -0500

remember/recall - what could've been a command line tool

During a meeting at work I realized I often forget useful commands. So I had the bright idea to create a command line tool that would basically append the command you wanted to remember to a file, which you could search over later when you wanted to recall it. I figured it could just be a simple bash script that reads your bash history and appends to a file - all things that are incredibly easy to do... or so I thought.

Look before you leap

This article is a reminder to myself to test the core functionality first, before decorating your program/script with all those bells and whistles. While I did learn a lot in the process, it is always good to check the basics first.

What went right

I actually ended up learning a lot during the development of the (never finished) tool. I had never used getopts inside a script before, which turned out to be extremely intuitive. That was all that went right...

What went wrong

Literally everything else that could've gone wrong did. The "project" was a single bash script roughly 160 lines long before I found out it wouldn't work. It was a series of flags that enabled actions that called functions, some of which ended the script either successfully or not. It wasn't necessarily a mess to read (I tried to make it so that every function ended in an exit, so once I entered one I could assume the script terminated there), but it was hard to follow while writing. I tried to allow a default action to make the CLI intuitive, which led to a messy set of if/elses and switch cases.

You can't access un-committed bash history

Bash commits your history to the history file at the end of the session. This makes sense once you know it - there are a lot of reasons why saving the commands to file after every execution is probably not the best idea. However, it can be enabled when you start a shell session. But I didn't want to build a tool that required me to remember to add something to my bash_profile before it would work. I wanted something I could just copy onto a new machine and have access to its functionality.
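For reference, the usual trick (the thing I would have had to put in my bash_profile) is to have bash flush its history after every command. Something along these lines - a sketch, not the script I actually wrote, and the function names are just illustrative:

  # in ~/.bashrc or ~/.bash_profile: write each command to the history file as it runs
  shopt -s histappend
  PROMPT_COMMAND='history -a'

  # with that in place, a "remember"/"recall" pair could be as small as:
  remember() { tail -n 1 ~/.bash_history >> ~/.remembered_commands; }
  recall()   { grep -i -- "$1" ~/.remembered_commands; }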

Lesson learned

While developing a tool to help me remember things, I learned something I cannot forget: test the core, simplest functionality first. Before you do anything else, validate that what you're trying to do will work. Because after building all of those fancy bells and whistles, if it can't do the basics, there is no point.

]]>
Lisps, Assembly, C, and Conlangs https://www.senders.io/blog/2019-12-09/ https://www.senders.io/blog/2019-12-09/index.html Mon, 09 Dec 2019 00:00:00 -0500

Lisps, Assembly, C, and Conlangs

I had originally hoped to do more blogging as a way of practicing my writing and as an incentive to do more hobby programming. The intent was never to make this site solely about programming - I actually had a few scrapped posts about baking and guitar that just didn't go anywhere... but that being said, I did a fair amount of hobbying in 2019 that I can share some unfiltered, semi-structured thoughts on.

Racket, 80x86, and even more C

Racket

Racket is a general-purpose lisp-like language. I began messing around in it with the intention of creating a language similar to Scribble, a document authoring language written in Racket. I made the classic mistake of trying to create a productivity tool rather than just doing the task I had originally intended to do. It was interesting messing around in a lisp/functional language, which I hadn't really done in a long time. I wish I had more insightful things to say about it, or a project to share. Either way it's well worth a look.

6502 -> 80x86 -> Commander X16

I wanted to play around with writing some assembly language programs. I looked back at the NES tutorials and tried writing some basic hello-world programs for it, but never really came out with anything worthwhile. I booted up dosbox and tried experimenting with some DOS programming to get a kick of nostalgia. On my way over to a friend's apartment I stumbled across an 80x86 reference book, which I took home and dug into. I made some decent progress in it, relative to my 6502 learning. But this was in the summer, and I was preparing for what would turn into a pretty time-consuming move. After my move, my puppy, and some youtube, The 8-Bit Guy made a video about his 8-bit computer project, the Commander X16, which I started looking into. Like all the other assembly language projects it never amounted to more than a few print statements or colors on the screen. But the X16 is something I am going to keep an eye on in 2020.
Ben Eater also started a 6502 video series which was amazing, and thankfully my learnings from earlier in the year made the content very understandable. In summary, I spent a lot of 2019 reading and watching a lot of content about assembly language programming, but never really did anything with it.

Never ending C

Without much to really say on the topic, I kept writing small programs in C throughout the year. I spent a lot of time debugging and troubleshooting a prefix-notation terminal calculator, with the intention of making it a full utility to use on the command line / from within scripts. You could do simple math without opening up x-calc, which I find myself doing to check some quick math. Example usage: calc "+ 1 1". To me this was far cleaner than writing: echo $((1+1)). The big ideas I had for it were adding a REPL and making it a command line calculator tool where you get the features of a standard calculator, with store and recall functions. This project involved making two stacks: one for the operations and one for the numbers. Implementing two stacks from scratch was interesting and I may upload the source and link it in an update. Overall it was full of breaks, bugs, wrong turns, and bizarre memory issues. So needless to say it was a fun 3 days of programming.

Non Programming Writing

The project that soaked up a majority of my writing time, which sadly should've been documented here, was my conlang / world-building project "Tyur". This project spawned out of sci-fi story ideas that, of course, never went anywhere (due to my poor dialog writing, and writing in general) and my interest in language history. I have been reading The Horse the Wheel and Language by David W. Anthony, which goes into the history around Proto-Indo-European. It can be a bit dense so I had been reading it on and off, and during the off times also started The Origins of Language: A Slim Guide by James R. Hurford, which tries to provide insights on the evolutionary concept of language. Both of these provided some fodder for the idea of creating my own conlang. My conlang is "Tyur", the language spoken by the Tyur people. This process has really been a mix of world-building around the Tyur and some fun fantasy mini story ideas similar to The Lord of the Rings and old Warhammer Fantasy worlds. This however began my adventure down the rabbit hole of trying to figure out how to create a font so I can write more here about it. The documentation on this conlang is a mix of loose-leaf pages folded in my bag that I scribble on when I get an idea. So figuring out a proper way of building the alphabet and some root words to start a dictionary are my current goals for the remainder of the year / start of 2020.

Closing

In closing, I think despite not writing much here, I messed around with some interesting languages this year, and hope I can hobby more in 2020.

]]>
Venturing back into C https://www.senders.io/blog/2019-02-17/ https://www.senders.io/blog/2019-02-17/index.html Sun, 17 Feb 2019 00:00:00 -0500

Venturing back into C

For the past two weeks or so I have been diving back into C programming. I've found it to be a very fun and refreshing experience coming off of a slog of Java 11 updates at work. I've found comfort in its simplicity and frustrations in my "I can do this without an IDE" mindset.

I started C programming in college during an 8 AM course, of which all I can remember is that it was at 8 AM. I loved programming in C: dealing with memory, pointers, no strings, structs, no strings, linking, no strings. It was a really interesting change from the web and Java programming I had done previously. Obviously the lack of a "string" type made things interesting, and was initially a challenge for me back then. In my most recent endeavour I found char * to be perfectly suitable for every case I came across. It was usually a separate library that was failing me, not a fixed char array. Back in college this was mostly because the programs I was writing were text adventures, where everything I did was using strings - and my lack of understanding of what was actually happening in C was really what was causing all the issues.

The Project

I started working on an application I had been meaning to develop called reminder.d. This daemon would monitor for reminder notifications I would send via a CLI, and queue them up based on the time set to send the notification. I ended up writing both the CLI and the daemon this past week, both in C.

The Beginning

This project started with an outline (as a README), which I think was the reason this ended up as an actually successful project. I had been thinking about it for a long time, and had begun using a calendar to keep track of long term reminders/dates etc. First, I outlined the architecture: "how do I actually want to send myself reminders?". Since half my day is spent in front of a computer, with a terminal open or at least two keystrokes away, a CLI would do the trick. Then how do I actually send myself notifications... by writing them down. So I can use the CLI to write to a file and have a daemon pick up the changes and notify me once it hits the desired time.

The CLI

The CLI, remindme, took in messages and appended them to a file. This file would be monitored by the daemon later on. Each reminder consisted of three parts:

  • Message - The body of the notification.
  • Time - This is either a datetime or a period for when the notification should send.
  • Flag - set by the CLI when the reminder is written to the file; this marks the status of the reminder

After a reminder is written, the daemon will pick it up and send the notification if the time set is now or in the past.

The Daemon

The daemon, reminder-daemon, opened and tailed a file at /usr/local/etc/reminder.d/$USER.list. It would tail the file, monitoring any incoming lines and parsing them into reminders. The syntax of a reminder is FLAG EPOCHSEC MESSAGE. Tokenized on spaces, each one is added to a linked list sorted by time. Every second the daemon checks the file for any new lines, adding reminders as they come in, then checks the head of the list. If the reminder at the head is ready to be notified, the daemon pops it off the list and sends the notification. After a notification is sent successfully the daemon modifies that line in the file, updating its FLAG to 'd'. This is so when the daemon starts back up it skips the reminder. Notifications are sent via libnotify, titled Reminder - $DATETIME with the message body. They are also set to last until dismissed manually; this way, if I were to walk away, once I sat back down I'd see the stale reminder waiting.
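For illustration, a pending line in the list file might look something like this (the 'n' flag for a new reminder and the message are made up here - the post only pins down the 'd' flag for done):

  n 1550448000 email Alex about the deploy after lunch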

Future Plans for Reminder.d

Having a system to create and send myself notifications is incredibly useful, but having them limited to just the computer I sent them on makes them very limited. I have been using them at work for the last few days and it's nice to be able to tell myself to remember to email a person after lunch. But I would also like to be able to tell myself things later in the day. I have planned since the beginning to have a remote server I can sync the reminders through, and in addition an application running on my phone that also gets and sets reminders.

Remote syncing would change entirely how I deal with reminders in the file.


 struct remnode { 
   long fileptr;              /* where this reminder's line lives in the file, so it can be fseek'd back to */
   struct reminder* reminder; /* the parsed reminder itself */
   struct remnode* next;      /* next node in the time-sorted list */
 }; 
      

is currently the struct I use to keep track of the reminders. fileptr is where the reminder's line is in the file, so I can fseek back to the location and overwrite its flag. I cannot currently think of a way to keep the files perfectly identical across machines without introducing countless edge cases. What I do think might work is providing some form of UUID. When a remote pull tells the system's daemon that a notification has been cleared, it can mark it by ID. Right now the fileptr is effectively its ID, but that will not work anymore. A composite key of the daemon's own id (generated at install?) with a new ID for each incoming message would help ensure uniqueness across ID generation on multiple systems.

What I've learned

First off, I probably could've done this in bash. With date, notify-send, git, awk, cron, and a few other useful commands I could very easily keep track of file changes and push notifications at a certain time. But seeing as I scrap together bash scripts all the time, I thought C would make things more fun.
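Just to show how little bash it would take - a throwaway sketch, not anything I actually run, with the function name and message purely illustrative:

  # usage: remindme "20:30" "stand up and stretch"
  remindme() {
    local target now
    target=$(date -d "$1" +%s)   # GNU date parses times like "20:30" or "tomorrow 9am"
    now=$(date +%s)
    shift
    ( sleep $(( target - now )) && notify-send -u critical "Reminder" "$*" ) &
  }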

Writing manpages was probably the most fun I had working on the project. They have a simple elegance to them, similar to C. That being said, you can FEEL the age of the language. Every single decision is there to make things simple to parse. Even compared to modern markup, the explicit, direct nature of the language made it easy to learn. Every tag served a specific purpose, and each objective I had had a flag to do it.


.TH REMINDME 1 
.SH NAME
 remindme \- Send yourself reminders at a specific time on one or more devices
.SH SYNOPSIS
.B remindme
[\fB\-t\fR \fITIME\fR]
[\fB\-\-at \fITIME\fR]
[\fB\-i\fR \fIPERIOD\fR]
[\fB\-\-in\fR \fIPERIOD\fR]
        
      

Libnotify was insanely easy to work with, from a programming perspective.


  NotifyNotification *notif = notify_notification_new(title, rem->message, "info");
  notify_notification_set_app_name(notif, APP_NAME);
  notify_notification_set_timeout(notif, NOTIFY_EXPIRES_NEVER);

  GError* error = NULL;
  gboolean shown = notify_notification_show(notif, &error);
        
      

In closing

Overall, this was an extremely fun first week of engineering. I look forward to what I am able to do syncing and sending notifications on android.

For the zero people reading: grab a beer and outline your project. Follow through. Think about the how, then write it down. Don't worry about getting into the weeds of how to write a manfile - that's what is fun about programming. I thought I'd botched my debian/sid environment uninstalling and reinstalling a notification daemon. In fact, I think it's caused me to take a stance on the whole systemd thing. Either way, start a private repo (they're free now), write a README and a LICENSE file, and iterate on the README until you realize "oh shit, this is something I can do". Then do it. This project still needs some work, but as an MVP, it's actually done. And now I can dive into the deep end of trying to actually make it easy to set up on a fresh PC. Or dive into modern android development and server syncing...

]]>
First! A New Years Resolution https://www.senders.io/blog/2019-01-21/ https://www.senders.io/blog/2019-01-21/index.html Mon, 21 Jan 2019 00:00:00 -0500

First! A New Years Resolution

I like to write small hacky things from time to time when I have a weekend to myself, or a day, or an hour... But I never had a place to put them or the push to complete them beyond their initial hack. So I decided I should write a blog about it.

Also, for work I had to write some prose about myself, something beyond a technical document or RFC, and I realized I am shit at writing my thoughts outside of a very direct, specific, technical way.

I am not sure if it is the age of the internet I grew up in, where most of my written communication was informal or for school. But my personal writing skills are trash, and this is my attempt to kill all the birds with one stone.

What can be expected here

My intention for this site, beyond just a landing page with my resume, is to upload some code snippets from things I found interesting, and ideally some recordings, drawings, and model painting.

How often do I intend to update this blog

Ideally, whenever I have something that I feel is worth sharing. But for the sake of my resolution I want to do at least one post a month, and if I am keeping my other resolutions I should have content to put here.

Designing my site

Designing this blog actually took way more time than it should have. It began when I wanted to tackle a javascriptless website, and I found that a bit difficult if I wanted to have code with syntax highlighting. So I wrote a python script to generate <pre> tags wrapping Java code with partial syntax highlighting (possibly I was just misreading highlight.js's usage documentation). But I would like to keep javascript off my main website, keeping it as simplistic as possible.

I test the site using both tidy and nginx via docker. Using tidy I can validate the html (making sure I didn't miss any tags, etc.) and tidy up any odd spacing. Then I visually test it by running nginx. Having it served up similarly to s3, all the paths will work, and it is insanely easy to set up! If you're reading this and have anything beyond a simple html file, I recommend running docker + nginx over any javascript server.
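That whole loop is only a couple of commands - roughly something like this (the file name and port are illustrative):

  # validate/clean the HTML in place, then preview the site locally
  tidy -quiet -indent -modify index.html
  docker run --rm -p 8080:80 -v "$PWD":/usr/share/nginx/html:ro nginx
  # then browse to http://localhost:8080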

Then I deploy the site through s3-cli, which is simple and to the point.

In Closing

I wanted to include more, but I ran out of time today. I will probably update this article with more information (and an updated timestamp), or just make another post about my code highlighting task.

]]>